[Unrecoverable binary content removed: this section was a raw tar archive containing gzip-compressed data, not text. The only information recoverable from the tar headers is the archive layout:

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz]
Gݿb4ܐLo?`bW ZYe;Sje]3JѤΤݙTo6M*ԕc"-cXG5VA[\iٽE WR;֜Zn -xT0Bq<=X̸B G-'eY*"KJc}Uw Б':b#w!r`2|Д *Fku" TIPF4i3Z㩆?rq\kOC0W[7&y.GyincYCuQi,!ya**`#T?i@nAB+ $}ܘdcbau57V70Wj{}^lV<ѡAa,翛߂{@{Í[ep&Y *|qZ;IO?/N=w^a'bSrc^tzL$w񇽳 0q0xIK#-0(Z+.%#*qoM0:7Ǿ$ wX TC05+`]ic> c]ҁQovjpm5˙Up%r?\Qjje e@W+UTdTROI+ LD2$b$8HmN҇k1ULm1U[:?/Xq;(QE AZXbq iFTIMS٬y>嗪[j.?5iW\ODxoOkxކ3׿p=q@ 8yq0J('`LNV7s?z0u:|7~M/^^?7B豜h}gK d:~ |ɛMW'x᧓WoޟPfN^^?rjk읋 W׿"kU֐A|w_z5|.%vӆ䶧Vxd~Dh2xv*p̹^v>R`#yZ^ryނeL%-*nh2;zZUjoM'$LJ! ~@^!e$$lOPiG ,) 8YsuSޙƆXާaE{Iލb/]戜o@ѥn垩Rߠ[6g$վ]Wt[ޯ)|[zhNxj#mK%WwLq[O$)ݩ1ޜ7}n#YC;nŏI\q&kٝ2"6nKR-\QJ*(l4WqUZ-!#v*Ж]pcV8M)s`yK ܦxWJa8P>֊\zC/3fRϗ qcN7?L\YϣMggnջdVWeMHi8Xe`H\Q SzA9W 00J^J;e.QejYJ,1LX+`ϟ1u\E/]!_z./'7^xܣX뽩OrȊ o/ޔp?8— ]_!DzO0?)GuKSg.OMOe $|Iq];>|b|f:'<2!C@1OMꬳ*Y)U֢p21:zI]|]UlqPE"L#r=V=tC7@)ЅƘ.pZyQe0Bß -1kBˈ9.wlPi G ]x.mL2D{H z4&hY{+r^~;M#:f9&r."6dȰ, Ƹ#aRʥ~)$17֜Џ$m2׶聜s߄p:ya嶼 j]15䣣bbVi앗ƽ%OYܛp]6+CdBKK'b-~vgH?%~gDevG*ܾxV\3X;; 4rkH&u$"[5fo[kn/*e]:5v7'6 ;vm{9jk^*]񷹾Zo`iKݷCM;*. oщ-zK:jλ?/6m泀m( ^v}5:T6*6 ! qM ,Ia95qvj:RjKf()&zG6|יl-iڌ޹*[UiK:u&r1F: #A2?9HyK=أXMٍ&nrecdU? 7yP F M3(ćEQ6 rե}k<|;nؑھ.v/="=O!`.KݟAm¶Я%zVmZ6.HL1Dɘ`]pA<`: 7^▱kVq e,!]Znu '1bePzT}MyL&ZGK!zƌƁZ0cĜE L#,8ZI}ffqô95 }!CEI\<[Ͱ-{ZHToO|ZAuVi95K <qI%% h(IE4cznG1;#%L0㕱#.u9o*en'BJ bw8 }NJ/gANpH͈$K ;s`uFK @8`ve! b3 jxT~8!,LI(4;2[)r'j잨ʧTN9KOz8BÃȊ-RFϕ,< }!8 `p"c8[W۬(4GkQx |Gm7 ,3r#cF|\%_R\q(X:, 'H )4%Q>r`^ MOٚ+~& b68DlG"~*ȓ& + @qLHAT;oԊk0fEo1TYQ+ l:D>$#1)DBr,i 0H s:5g7"@pq:pNlZr(.̸;\pqkd+0aBaB5(c ZEh)A{͔v6G)d&px6xK;axx.Al?)u?GkJ#hsWnty9ߋ^}[ l4yXMh҅#޹Rp44YJ _ Yig?^oa=-u3ݴ_MwIEÓOUa_y* BiK4*5AR鄡jW/.yq88AZ+ov c}\#GpPד6k Kan' 1|K) |sdx"xhtZedEG8zZzG-zVkv 8ô' 4K:,AT3*TV O+~yq \O_ceUL˄O[rS09.)tkpYSBK譐b ^<-4|1ZaΖ'P_nЇi'@=&E1ŮNWkT}ѯ=bA+^O!f1'4:-ٵTQ /Ir+Zw&=FVSjգ|Kx#ݵLɷX+A*teo K9W:h4+p@MWޓ7{ QU,wGo6TCΎk$wk{al&ɾJjn^t>Eײw({ZK;>5|)@OՠѶYq$ڦlMq:2F'9 'nSE&Nx='Z*݆/?utxmsc:ۂ6DK*H#cA7k{,@;QH@Ȋǂ|Tomʃ"LL AC!^Vnh 0k'qj~T#M9t^y^#_t+%E`˃LчwfT!АP*00࡝&r.`9V&A/Yԯth$]`vFP|$H0-H(~gW-aRa=CWjS{z=tІ2_ӷՄi7#2<]4I 7/T*&4CQo?m|_<Vфj?-j MP5N]ا_5|jV'J|1ſ̴ټhj[Ї_!Y 0H(򥖜N)bxЄ~%Hr!q.Vk1RwR>T"Kr;#6 Si44%ܕJ8^b$IGvӅ|jrճtiw+.e?w$AftpeJA+m(Al`!Rjϰo!#;La0w.β˃Aٜ0%tj j/w5Ю憩H74@λZtjOg5RꮫSaHrui+5VnVéō"Rʘ^/|mw;=Fs5n0z`Ҥ%o'R/q?P,]XJKמШp0"XLd*(H$*bE HZ瑱lG~jInsnmܘiäQ)@g d'fFƽ.dXXKEs-5e~eg_RYE&Սr~YdHh z#&ֻcvܮݘ_+mڂ"lAׯ%7ŹoA1v[P%j?mZ|88hRSHesb/2%;ɞ^KnNh\P؅&/9`W0$#6?_ܺjٯ,NАKDtA}IN9\KO9*Ra%ZK5 EқT2||?2_9?ګ6}8%TΩÖ.EHU&dv@*Y af]mMR8f4n=k7BW׵Sol^;s 9Hf!lDSCKv/R$UA;\Fڨ`z0]Ϻ~(G(%&/BIYkAE)EqbyQ^ s'ØU7ix M.b|m gMPQ=YRD>Ikjɸg2pW)RPDHAZ fI1QSmԭHjhIQkڠccj,ZY;߬5>E5[a:IMUB-pRC&gֿ̘ lv)J3dE D$/>-&^߼㱖[#uLJwa׾?(hR!)RTd1|BHbUVSi-cd=;_P[>:D! x](yfѾ7uuuA>e|,Uwڲ:&DyB )]B=>Oչx|5AҋqPڼuNTL`E1\22o?C^G[%c(mw)D쐤_f\9Ư V Ҳ1k#U>).eT5#Q(xKim+UrxE1"zҾu:Q<CD OQd}Y d.FL>y(ʨԳRȦB@S͗Rly˄@4>ZE=flEJ9''db{'CuqLOHԐIX<ĪuɕJ\,^4\8Bh#4f,q Y'SIchJƬC7,$?PVC֗] (Cܕ&ayX?n`sE ] * *U+0*eЄWFsL@)EB= AATZ}d HSo9e:.|-cngUf2j2i vEm%rx%Y z3xW9kP+1h a6) % ED&PM46#]3cY&rS 5%QAGE΢I nB+?% 1WM5>̚`(ƍAN:M::Y{2h*ܙ@hq(H_0`PfRQ# 0}RQ&z]Vj}OJ2"f_\_-id^!EѯU5IAQh8PZMi,j"b $yh*PGwXjq @~nk΋ q)}Ŭ椁:c$lb"whtD5!"%T}Y;s&>MBnwiW2nyͻP ئLƫC G] XOW%)hJ Qr2Pz7z/3 q=%4D iŠ,}ڄL⸱c}B%H<ܗIZ[x3 nxm}G"Y ԏD?AՈXEYQ 6Yr6e5@D*݇y?d)(K*!+p-F6F~t)=hAjx1) 7D>G.c `5Bm5f HcJIM9z"0Zt&!:v, `赈(;$1cMm!I1%4K&VAa:(@/"}Q8$}]qcmBg"0SHE_(M.Gdzբc?ThU_}07[E0%V$N^yk +(|4JNae 5f@hZI? 0"QrR5]@=u@PRpqLr7׈ .zu,JYVf˥!v`SFwL£j.8 $h/]XzǕL݁ˠvO]g3%8 N'Հn^ܴX6[7{|` fp`BC! 
Ha൹-?>}v`j}V"ao_ޛ7C|?@E7 g(EkMś- Rn0!}}1N^/+tZ]-NOȳ|KVYYeiu|8z=k'۽G뭛 s{*n-]p{N{<ԏAA~T3#U6Ҁ;(tb@/J X +E%uG+V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+J /I Q\o_)`4VjJoQ d+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@߮KRECr@0ׇgWJ#X *&JV@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+Y%z {AJn{9N X+U@:*Y J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@v@n7K6}ipsvCp߹ֲ?ɋ"\5ni]?Gբ|l^gi_vk+}^^.C ub]XP4<8ȡor@3h^4wmCjwgc[-EO`u~޿"ݮG*}>þKzi<@a=֣ѻcZ] oY;.S}x>.6FlGU$Urޥh}ػD7~asz\yj6ŰCG:QnnL Ah\tqC}s]|q YvqqsY_a]^|7Z>qzu={5^Xcz4e.ݬ7l{)r}@?!J9߹{dj6଺\V!ʶlޗ|ϞU)͢% ج(JMz Qy&)WNs8Y<j˧#؋ͯM<3W҅WRWR.9롿fW{~g;4q?ur>8 ,S`VG}+Gqʄ1+1?O7?]wn(bh^j0&9fm]%U(2sd+eDֱo.QCfUkw.r]i"9)B8!̜#6X|u}{a< i6NnCώqG}|wb?/.}|Lne=kAbuvMic^EtĪ4UUwi۷!ŤK S|DŽK s.οUĨ~_m-_.nx;| y{ ?tx5[r:2-F4ݐ*(H$*bE HZ瑱lG hb:~t0|59q4UgHqoC1Kɿ`ub4Te&iXu<&)[5]Nv|0U濯n\+)7ȓR#"v\0&+b"9F"r0C)Sw`*A%x~ƺlu*a4՛P.!zr5dI<9=I\&D>x:O0$|=ue13"iW VvZ.g J0,)=N ɗ6_^q[H\As<.=S{wܞ܌kbq~r!,+b$GgR}+"`_33Y.8LaIeڙ_?y4&(i4чɴQiN>/NY y &/O޾~_`pK~0 ??55?_pUanjSKf^{}--Z(=.~b1>|9=C{vͦDHAfS3+@6y`T껛T%U*D ?>6tM %vP,-rc J6챃l~ 52KA+à z#(wR:~:} +/۬g2BZZ8k-N ʘny8wf f]vHZG'[PTN=rMzEIy̋3䧫J"HSd,CE `U0d6TcA6h)HQn%a+8!rƃb(gD62eq[ã R∪:kYC^C6rV kadz+@l8mnh;`(%lx發/Ӂ| *pS!˸]tǜRmY.JZIT1ޤz>8+AQ<W s|)&MHgfی%q6L| /{|}i*hhs>}o}kb)V[cs}xT:fx O:.~N oF[GQ? ucl zWG\&{doNUo~1b#<|LMқs;~7]Kٿ_.-8U^폿M#ԙq.hTI~Gy:/V(֤fm |ܢҏZ ?uJl]"zxkR2mZ\1lX+ÀG>h.db5긹'E/J"AW+:}c_'+^u6ٛ"1].#zk>h~Q~OUGHg&Ow)EN'~;v:փƲ`s$Zϣ#XP]9K3-j -0r!c2 Cv`(9AI19#RiLf}amwqI1F;j6,?F5Ј_FL[8P& ,坷DjŵS3ɢ7ؘb6W #$$8pF"HLzz "ZA94j$.wslܭ?xbex3If8ōzrJ5a  :^};E~Te R y7Y?H&"n<ތ~={)~dN'#W-뚹ޕ`aQ~V0#*)dFRjmw19D@̐6tki]ڳiIZn;% [c!fH Z{ޱԁkn1Xe{} X_I:1r9 rVPhuR,ྴS2>P(ւݛ;~߫oͯwIKWn?ՐǾg{*IwA}v4[$=u ',X(@,dKk,)Q48Cb=bx-DJxϤ0R'U6D5PD4X: ,@iR3CsZhw }:m6t'6DGR$A5Zce,:uy߉ hR0w0%sH[]F)2 }Vی}My9Lǵ rMJUHx3jg>LLj9FXp4͔t>Sچ{JI׾x} &0&Cf^^/vVZ[{"ލF]e;a%i'=\zx^)DB~xEG"\tUtD!/EGv?;G;y0E*gX+5Vi# 乔QT*%!oCQ{0r` #52(fT g{8WPP}z0N66=Y䏍!Q-p(br(DKnðəawOwU#dBlm8=9g>GrIѹzCq`9M壄: Jpю)&.nL-盵o&[q=pͻǸ o UrUdU쿾n~5wQw񏯫h:W/*Jԫ|zUVਢva~[΅vY>Wk'dgy]0:nkYxݢt\xbÀmYșg?njop8a6|۞.og}^22m\7>yq6-J2> H[jc2,۲mŭW m*s͆MrqpTӄ+2 Usm+3JsBԏZ5G:2ҦTOgM7fxT_p!l,7SWO₮aзٖyE2X=vU aE<eWk^CˊٕP-NB'ȄɁk頙 2{LZb(p^27B( x{b5xd6 e{rGGsĎsE\1!"lKEm^m%} ݛv#4)1̞IEжnoa.YjVj[VrpBiוTRWڮ mma̗'U?Ғa놈ӓZ:Y0:Fr׉, %N/;-He?&=\iWYy6'{<\A[?7TZ޾)}sZ6 Mo≦S q L)dxڈb (4 m2&!CIƣQsE2*z^s ,)7q nZⅺl[AwG8j u3td^"bs2.HV@Ok櫖04V.pnFV?akJ7;纲u__=q> MLn8=cAGBp_E>:O'Vŵje Dĝr8tժͫ1GNpyl~s F`yI@.-L%ʬbN0p5@sOFd;O'O ?C%-][ӞK,%{?߰:F.:DL)e@I&!Jgbt|$y) &T!?J+IPx t]jy2:T=.@ Dw5"H.B]JOPuOҿ[O@8ԧH ( 7 c -ns*裊nNHM. HexetG,2z d} g Զ˟q{>>OK~z8 (n6e9<B3=&0 b ^}$4ձ pZ[%\0ԸM{ k6,HT 6`&2))e.+a^lz*c"šE(F 8K@(4#2tޛ&V>%T~з`4ćj|^{A0p2%RMhxΙLȥ!fǐӌ*aI0`/ 詔&nU<{;#?:?cG  ]GRr^?(W^Of{к Iz׆(S pA! 8?8nq/;-ˆXoVK%>(CWDH\D"n$1Km]"XLIr:"=b2 B{@ elzdJ292[zUOO&΋=|C9KjްÿÕ%eA\)IDL>AkRF4 "z)"^e%zL7oH)t5ٯ?rWqŞqFo.ҥTB !糣Q(N%D_'9j)` s,Dw|&Voo9^,En,pJqK)~ދknJ z$H)wH D'Ic꣑@TǴN1㲉׎z[ ORmdudTX8?Ys PC *jJ$# M=\.|&ƹaNݺva7% 6',B3rOc&r܊,0bZSM-R(c Ae u-c>ɷgu&3cX۟(Pk<,꺝Rt]-Qχr /ӪqWk$^Tb%KpN;surwy1=6Y. f~E%BU !UN9[ı\LFfg-~Y\jEZX+w]^-j^l,vPu&gmEu#|b0A )2jgZ[6l韯թTG+Ѳp'E5Pl{ Y"Y@h TZbk/E ̣:5&RU_t#W+˽f9pVTgΥN+K /+{*Qjo"]-8;o 8E=+VVկ5+ˋ4D9t2*9CҌGaYJL@}b:HiyƲIEVVa2)J4w.%Dsh] h]oH.--%ʡDsh]K4w.%Dsh@цZ]SK4w.%DsHuHeTDsh]K4wq=(%Dsh]K4w~hݕm: ВמJ"%L EZ *j͈4NkA *y&A%!yVE0eE$Eܚh$</s V'ǥp$GzDAЫ A&76JKP7~.Mђ(z gK{E]Xn0Ը/lbmun}um9﫛Ϸnn nXކǸF2rhMFZΑ(Z#%dDr'7CH:^X)}jVV< CP6T82X)@q9@ uv$TXw*09f݆O9ށ4&G"Fe5Q8kE̪HR9 ? 
jkچrR+UT0^K*)jX2̫D2u1H2m({[sNAL-#)՞TS?L~_Y\%ڤAVx8 S -jsؘ-"041`(I]8efh\o XɥڞڼY^_q"Vljr1l",''lpDZqNy9 F^7HY)XO] p#VyOqBaO3dޮf秩Z?C8\+H3lD{{VñA)kWy"f1p j)GչwZ>]-fFښ~]yxny:8b0 f>'i4ꀎGrYAlߥZMg!XnthC i⟋Ɔ* !9ӛ^~۫~xOo^}Bv8{!z|'tmFlѵ}r·W}}* 0֖8[|I~؄|yڥThOٮZ0,'OpEZaSeqqTJʳ/c+B bMv6i!V?[HwqNwpN2)UtH^,ڃSh$ԅ^:5W<>TPFMkgWO"&njЪ)Z;.,P" "}kugGH@ĥS J4A+(5J-Qp*iZv'*g},ȓ%|*v_9 =ͻ7|Z|U?ﰪw R{$`/ &G=*~$;AH[xRуՊ%^yH{d\C 4AG*y" $qM#I+N.!`ooM%1HCVW=3CRrK1litU_O=d2p.M2I̠J(svLCZ&r-zB~Dd7?mZہ-6wǏHg]\p\'N!iUJ$'ͽtA !z+J1<$gY)΢B"?*ӨCR[" |x:K ָK1`l e8^?KDX[ꂐLx…ԉ93ښĴpJ1fbZp9ϯ^:+(2&Խ@Qif7q\]Bx 7/}ff!YŞ;:di7.Gdo3XU9Y@J׹iVMfNw$ݡ?U:p(cD5wEM\\tf1GZ'ӃCu;c4gx1(yTG*VuOeaW[60+}t av_jW6On{sXAY}>+6^I!T߿>VY~hp'ඎKw&U$U UmI?, M@4Ybr~vC2- p~b, mys^֑{GoSo6ௗ氷9mv;P~bZNotR S|>E/Y9;=d9o3T??O+vn(rwCnn{/̆>J͖ By@~<6ȶb9q^f\4xwԷѐ7,S  ;ydzywD=S;65[t55[ Y$?n2[ PjƆy–ΰMMyt4mFObÅV`ŽN-+> Qbcܥy.͗iw[0C yWs. PNcN!)5~HX̦y_Ahʗ:;KL~sxxK`Gj ŠHkiŎs#Cy$xpFH" $Z3QB".)GSZ1X{|<1lܽwеN}|yҧryt۱f{',@(P%ȠIT\O"H)(8~LTsa# Ձ%xKerƁUR 0Jlrh[̤ ^q@J@a .&v g2 e^lkdh^12gbF+A3)~=sUHI=#s\cqd#sS|AWΐh HɓMR!J E aEG8#*;Hs-;ňQ 1Pcpr<8g2!@&K!!(`%XTaYLcg_k-m@Aaܝ]l!pCȍuaY 24( Q Tȭu0"'GVֱeZ<2ouv{XK?w4v֡-zcծS멹U]k\ 2z:ruh6rPCYqnz(a3o8q9,h;wJhH9wj !3[ )x&.^nY(ݻƝiJHz"%(rC'cyx.jwELg3 :DθA|4E3q}Wzc}s/3\b56Ƈ8./9JmAU]6rk7G=fY2`lvˆ?!нWίӷ|PP:~s <ci=Ң2hHkX79[0qZLg Z1!qj Gu{f 秽xȅ>'YКy sԸ8aYA 98L0-[i};[x}T(-oJ[5/<aHhf> &|zCx2tB$/;u\" Ea$:RjቭtNL2s: k*sK6DD}ҚVx, BDNJ KZy 6Sm30 ^d~:k}?okegZ;#{&'}}!tI7ziJK22ZZp4i*}. 0 Y"`@))m.PS\ߋY|)2 r4v0sO^0~ԋ_B3-cHQ!$x+AƩ$ v;aQ{A:dE5 -#V9PQ&SC݃5a:N @ĦQT[KƩh9ႁ0DSXЀyh$>Z([q/C&{1βJ{Xl 161a*צI)q[l7+s_vPm [m;ZnJ# %dL#Ղ$Kr(Q5w$mI* I2dDT HքHb䎡B^tbl-_Ɇq_,b@W'*fZh_EiNJxRǴ5zC" ̥R]PBȍڤ2Ђlg\pވ"ȤpuyQ"gEpQ;*>u%]]KQKA"@(CNYQԃhzQINFvqoaٱ/=> +.x ҦlrQ",+*F\̧Ӹ_Qn`uBHZ TMpM+ IPN(/'AuHȾ8 >Pa|fr ,IkOM@i+34cH#$AjAhÝ1nxlA'o \[JR"t%Y0b`tR9\1߸sIAʞ%O%tTFwCR&厠Tr>M/?2RDyqż(T^}8աgݟׯlu篪lXVUpǿz߫0ig͟gU>3JzuW_W?ʒzU:ZɎig_\YXUu"_g=x^hk/Xڞ~o{p8`2+^/{{}Aqs>~TX;R8-? H j!d%fB=jS\u`D ~1O0 UC'4<wU.~8h{Ya00kr i/"xΧU_'΍٩曛Tàxx{;Cv =Z@p.:nvW aGw>!=0S&+Aa'AKgX&4TL@S78hHyNOz$hq3'Ķk-Zpl߶lv6Ks8YbbɖvwÝd~5W9 ֠p~aRMih1o3?MLV}[5'v^]>n*֥.qXԔ#s=rHpVwbX/O!?*\+wVgyaZ^[vѪ&SMJ t7$|lhLwYb\G^iA(@5n9Ć|Ժ]T<%g gdQkQHF# x,"e3XZ))%. 9D\bC IӉQHH> ]Xo_?B[Hgz qMd7hS$MRoH#Ӝ*#u8"`)E]9`l*mDswGVW>U*NK`8^tkY> 㢦{4px\N%vslўБm%骯{&ª@9@riҋXEeph 嶖SVnղ(U#g >}nmU~Ť3AJsH<&څhe3(3Y΢DzM::|oZ 1뵴-{ ;]4Gjbq#R`a O1N6/3:^Ir>}GsM5adhiHR:'GE]Pl} '[2˷Mu8ԗgh"K 5ЈQ HekQ Re&P6Z.P፧:4q5ῴJ#ABjjTwn*ۓ-gFxys@*2^Jۥx{8L_ALWWsfsoPgXQkI\8xd<y%aE_e,|AeX!ʂ&gR_?~orD3X)ҋ%9q8ן`^gx:T<У}RO~ɳ'7Ag5W~BWdLjMqKR1% 9KBE@9MAxETDLd Ζp8iRhr% (=r s%/}f)&p.! jy Ӹ1H4+DJRZ~sۢ)|bܱ/8cl0@ۨeT&0]>rZ)G".o!F;>)u!aC6H;wշF6*|=!jNj-yg\!A_*FqR3KFp'CؚSe0qqwgq<^0a 3Vֻb\mRV~Z\|p X]1 zHݣBhDhZ~&^˵P+$ FT8͕N@ؐ' !FU0͔sɳ}ٮ|L(RK195x,ڄ(#Pcm+o CM;k_lrYn vXZBK~+mkɾ-ٚ}SsRN/!{^LF^nTq+DBaךY 㶥1Wtj-~(˱KiH女<M2V9Mg}fN,;)Jb7 ?t>~g?imRw/;1׽|q澯nGǀOx/z<ՋJ-lK@jVB=A*tPQH&-+=ۏERW/ '8Ŵ.6WXLIAE)҅m2+Zf2+Lh  Q#0xJ)P}hAXTd6؟G|G4K<wz'.)^lr{jVfqFA ?(u`żӉϟ=N6 NOedS\ucؼT8uĩ \>?F>|+uG>AurE _ͺrwǯ^˫/__z}L>~׿]8s.{w{u@g<_/ţuSM ъ69zo\<׍o4h~Sw\UޜT(O|׬ |d_nr>-^|WRBB4bH NN2*)u#pz87<:ƾK1Dڊ#@2Y5 ~~dǧ9mݏmux:@z`T OuQFvK<(/X`5vj/#9SL&<\ e!#AͣP.YQuDPCTO1uèQa{>YeF\%JKdWlI22ԣ*y5Wrs<*UaNI+s^MZPC2)V#Hk\˄J~+A1̉1dnvr{شShU9 q}2R_9  .SJH2hdJlN:MZ\5U b؝bPxD+\2uV$MQ u<(RxiӐ6÷Z ZwZȯbHޟ,NonoSG[t7\{ƒSŐZ[ULr/A"DpE *f`49 YzZ4#X&h)! BBrj>b ]yK1 ɥl 89{pY(:*7.P9+wR2zJ}cR|AehFfo? ]ש[-:[7dCx %ntxG[PUWw Bd_y(ZILV8 KI1s2ҤR9J: 0 K\?psx[/Ӟ{5 c?UAoF6mmTt믝񿱟A]}륨ܫBh˘w1JsE5?_'2Yй1yVЮg5dvFo;/BȁPZU|/labmY PWu7;;sEUvyou9ݱc.YC/z݆yU(clkIry^ݸӝFM>佌S6Me#OmNԝb`8̔^~2]yݼ nl[r|༚֑vuzu5zK{#l氽kh13R5q2*~:u^^r>#dy?ڞ:>>;Mq7ͣ B\͆mvH. 
Qrf`yL5-ʦ$g鮪zW91!1t%u*@4: \p# )B+\N%,LPDc0qgǶK`ˋ>rA<\rw~:cMgmk4bǯҷYd]d^e$!]pK6)/kϡՎ;n1ELa*EN,{+ve{四kjM[s<򷸽a-_CiWCiCW4<.ݢ-Ǫ=N3ќ>uG-\7Ng :~CO۞?7מLNxHC|Y&q=cZ?&/b.QgJ)CC*uMPjm}ԃXg>Ń4&.VRsmpG <H8L<Ȉg*Ab>LQ( ROMt;;[6FNo\`ⰹʒX$0i{+H߽/mx:}.(}(rL]=۟Kɝ Ac69x6pꁅvXp#'BZVY=5s;9,ֵ0qV m X $A,A,C,ȣ 86Gx|2lJ娴oө9aQ9 Er"mYhۣ/j`=: i0Jhkc&ɤx2N̵gh齳26Sm0-.xM ˹~rz{lMm~Wk~yFlZw8@/gQ@⭴:J*ZL-Ja@+ =:]5`vx$ A0iU0ZR7`8gv}2!@HJewdRR9`h/ů$ffVrI!u,H G\3A;-0Q9{TUEEƯ̆ i@uv^k^^%yDHIet 2++jVT<7l ]?HYKN{:a?=YfTʫmNEZhD 9b<dV  aT2cD-L;Le4rxR\6eDL.gFZA&#U2VjXT$cS[*B[/3|_|kt]dHyx70> N/b#Efg#]0%J@N:prD@IY"=' C!{!)ٔFCX%+|s> Qy#fxR;-v ]M:6ھ{G$X!*QXbVlxkE-Vie@ j6bp+bȄ "(IȢ(Jj FR։^:P;S=)qW,b5"NUV;[į"B,  &`D*N8WVL\R۪;Θ1j%ϦD.v&BV$I`9lE֝[di&]&%EQ.]ťY`gPq!=l3 zʑO^9ֳ2S6Zұ+pr[0aE=c/*m2'y?;^?eZb4E;ih-u69y *&ܲ$(Ƅns/e'A-'A]qܰ/=Vr;n% xUAR!XBpX#dI[JƤ9xºe@q{ j>%cG2Z$SΚa:)Ar.5ӎ({&8u, P%>8X\#h˂0B!zMP_J8OdmYS"{۴=-'X͋oiuhѴ.a8:~7Mw>l^5ImgkoҨ /\O\/yqNk^Eɋ͋#&w1|48s ^H|u'QHK},9\ɽpF/ÛE+wJдWD?R78,z*5x&1;ozb6<~SzJn^#fz ~tׄ%{:8"&m/_%S8R0> gEC7jR?fH?B4Okӝ޾쥦{s3cQ{?SoJ܍QâW^6?>͔a|BFtQĵJ#duF,lEK2cZutG9FrLe1n=86\CoC1 G6faY+qA5r$?KKnr"Ȝ!YZMQiQ& ',%,ΰ Qj's@BVfMQpMB2MAR' A]P%ǗF: ʭZz0trĩ$t*ɔg?W`ϖ7X* !R8DMLfK)d`rƤ'[BN `[5;սl_FUկ4X_t~|A=rvYgr}Zᕋ=:N($sӹJ ߻wp6 u :DrfES=Bvr+Lרhu[_>^o7]?qyTB 1JU:+̒LK%LJ 8%f"sONżhf?-1۵bNdi9\ĻH58yG 2mF@&TA󒈤`&d4).Ր-3o4 Yh(Bԯ[2^'Q?7Z}585c*ZЪy}0*ؗy];H]dި/wLQdF&!g@q$HN(3RT32[җIuC3+IQxH\GCO ȲP` vuv ܾ<;F<y @SSE%Ok}Ncw9x>90c:bx&^kUyhF`GψNOӸ> ̷ Ia9] |Co{'\Ɣ M˳tF^V0KT|<~ƃH=a<,rMŞc$pK|?/ M{yf:F'ԁ kO%{%Dd*0H 195'R3|3|'A`(10<.ba\xσ?x޸}w+aC{ ?{kG_?aoCi(ơd<]w ccԉY,7t J,L:/`Q'xz^?*什&{S55Ʃ}LSS{kӓszwoසp0Ӣ|Х<н-O'tw(gZwf^ɚ`3)3iX=rl+hwype)'WJ.i)U(gNZ cܒ t˭ĔO^99+&E@\:(E!J\;x&k׼2Ô%sB=iŶFB;[=ko.Y x~'xNNHu!mb3n"`VQND גҋ@ߕWN:gc\ERWR\ ۰/_EK gJ Z!;b Z*bfBfd"GPLy6xfoSw)O^|r@98 GZi nso "DLNYp*ICL Ʊ$X KW* ])PM&6*pLj~EA)$~ / 8'.9 YCN8 `!HzUL g9M/ i 3g:{ާ._6xlwzAM$8CԒ 5 %#"6B@֖3y0o}&T~~H5-v-v~;7Oc($f9ާ &cٸCDG򹖜N)bxЄ ;|RR^qqCizXȧ@`w22Tns.mfƁxKnC6&0.ݎ }(GYmRw7mG_oA澵P: ~x%='.} Z8u+rP̙,+0ς9LO-O6U4(eV1"5DN0>r+zm[ϔ<}'q{U )MZGv>7W&C*eg\>^3RnRS2Bס` \O(Ê+,=B~a:^G0 ţ^uyzwbO6wfaZSknwmﭞ_y5_w'=$㯽nr]h,Dƞu{:nl Xzi,iOaQ!¸b٪6OdkUy_C'/a 6C2gsa1&qe4A9lRc=<'_ \H(Kb!ns@ $) rfK$zu1 ap)1)!HJ;f2Rʃp-Z^jFbm:] :0^\Wk"2^xՏ!ۇءGnj%:\.NF$*B-QFהaTFNTBj[T|@m7-k R")9IE@)Q* *͸Jb,#FbrwǒRTr$Z|f`p'ڧ RgMh01y/3l)Kah @OBj@u5uaDhTpB:.(+9B@"VĀu]{40Rtp-cpRsnC6 ?-<.{G)gj4Hr\r%&Yg}b IbKR ̸q?ri齽5=)QN c2 3v9F"r4CGgRP;2E.樘8 cϚ@0!CZX.*lf,o`CFONɼ(N3Mu{>d3S z`gm~+%PrQcRp l߶46/vpyTʖ$[[S:'7#Z~)ZUζL,d.$GgRGǫyL;oBG֑>E0} s0чɴQiM>o.c.F]SuTv,mԶb*2EzIs 輪eɫW̒G>ٴzqKN9ѩl4gn4;r?;}7^oN_~w:}W޼[ES C; kuZ>W]oxqYKnS*/֒(=~r.4x۟iykV%Czd&>W`rVoJp܅/E~@_?{F:9- }%ow/|v{O^}%A+à zhGJQ&_7~fFǠ<'T%2΀2*CAȨb WvKljߛm2uG s<TNx$:b^CZPi`ŹH|^p5T/UhӯqQ|1E/jkCՓD\iPWWѸdi`Ź:3I8~M(vL'GC% yuAgaCi1 ϋ84 )݅Gl|u6[@7:?A SOi:]]3ﰂכ+H/>MˢD" K)m,}K |GVG֓k$Cym`XJo˛;5QTYml6OP;IQc {۬ޘg +V2({y1Tec}BiCON2b8=5Y8Yj:Rjsfȵ1O%Lc1y2Ndqq,N78Ŝ9s˭v:F S|߃N-TZk) Mi딻\ s 4RÁZ޳/jvEC"D#U9#6$ͤ4騴|~26}(s$(+;Ï&D(i |2FAʪOj_s2P615 ټikTdޫ}kӇIm~ls! b-eHVb֭!c5 AKA"r+, HD5a O $NziD62eq[ãR܁FXo ]w^~#P&λBw{+e؏>jR8^?Y;2%sJUf*i9'I}pӏ^ zZE y>7u"DIx9` ?0 Qj8ZD[ZA#yn8FyA/Xr|\8 dIFuv3NCq*dVB(N \?C ŋgYU8$ehߤmٍ 6@'ٓԇO \7IpS pCЖE%qϝD2 $WZs<$Ɏ/-Z +[%i#@Q`d^)؞D;&H؋7KNjb/uz7V>V^G3:"Mڟ'`PWF&h2ȘeS)pMBKO_oYr|g7?ȹev9g:2Oާ2JQs\Tf%OVA~5O/?Bਥ_Ivx{%xHd<2j:o|R@ؗcL<(cdL̅U"}g&J KVu_GD@YZ@eצh&R/ܤQ3 ΤѼq(,K-[ەk/Y3o %u*rK^I|_vIݬ%$d)RM@N59ia~.`IS7`'4VU'`sv\G8AÌMB~^_ٿz:Euy&iⷣxN.7w#%wcȾBY^dvA .I"?h$H"r'@#%b"*I.Lʛ%&M4m%C);aj)pB8q1p1:h a \+Pk=Hbm e u10`Wֻ'ۺ'H% i瑷KA~.Vhu{7ٴu}Qv9G0۔֭BnWDfԝR(mn{[ߥJJWSW*kw>)-`?jjvŞj.{.^+m`gM4O. 
lP"Lz-:6=˪Ȥ.]csӱSŝi_6Mtӽ޹z/.Їȇ좍5H"gL }tɠ0+b> $|Iq罼-W'F_gO=Dm}yC!$v(PcS|d:묠 ؜"upJ(:m:(L,A"x?+h^PgZ=AP_˪O冡n lO8{8`.(/tG Kw>h% zx4^Р}&>ZGyBB($<3cm.RFXp4Vaoܦ-=6Ҵ!O]p}&ڥ+2P*LGYmRw7zҢo޾th}Ғ֞rk.U~[ᅰdE# \jT$ 0x+xSa"-Dek*É)J' C1YϢhRNCTTqG.a* g{d,Ubk,4cpIt󋌌hw쮻p>_WAwGl#iOҤ ,'\0),jrh$SE#6~Yx2s=,MtPHEFcU)@"ApGl?y(QummkԮvcmm0D|)0,V b4o)d Q5HڀI*7 dȈ @(b@bLPc_ņ=~PDbcDw׈DDiv rPDž**GT 3mP*H1[5rcSTZ㌏> p0! FÝ>*]l8#[1.v !u]"+i5.nHNDM.,2q ?c #t`%~6Zal^V>Rq(pǂ{a[3M?Gn] |oX2K W_)X {<3 Ѩf@~WۆԛW]oA; @ :(xk"&DnI's*s)#sYcv߁֥, ӃӜL&ۊJ"-H!e<[!^b8_ @3#"γ$ Ⅽ9ke{'._,ƳYy%Ӛڨp R$%H2<eKcq))\H>WϷIPDRBHZcL-4Ǵ҄Ub g(tH vLY6yѬZ(MJ[bÓ`>a)UAAw>[9rx C xՎ'm->o=K'qŊw~$l@B+ۛoVTq[ym3[[1^j'J'1D|8OIL=֦ 'ن59}}iZv`yRh]}١Gӎ.駮k;wdo?5d<=o>4pcn#8_Krƣq?H& /7k2͛ztҭ( "X7"L={&p݉]Gf^~O7g;y2Z44击M0cp *;Yq>y`VxмkD嶰 p5w)HӝqȬ!_zu>>pi73v<ߘ.M _ f:_e/./ w67C3X8eX)`۬R|]8[2NrY& TL3?>F`ˤ%6%q>3)$΀\k!hfyϟH]_)U r͍^5b>Gv^?,W|Q^=A^p bd. fo$Y_An˯d63Ҫ?5ݗj49>zUR{_upw^s9/p)qfoWV> F4-{%=YFd;]nS̡5%ϫ6*Fq>ʈZ{E;ukzޞY+52@q~hpv@>6EwgW}.C[yp6RH|ČsJE. oC+TBa!Rk1DkRTRs! Hj;aU@"6h3S)Ab,AWK2E )[}_W#r&efC~|UGw7Bkྨ ^Wl+:z%td_fax~=zDZ{.[k5*ȖwRphꀽբ~% SvY;dؗT_+<*% i!ĩ\Q[.JYŜ` j"QO.ܱnR?X9bbmi3Vlv\(GuhR<&<2b9I&!Jgb ;~8qs>p Bs*zm0Ca4#f8YO[-}f6|/j r޸'|ڻAT.0A4q,+[R.v&cs7;'9?VNoZ1R 5޿5^2rkV>1e ǭV'i"٨ *-l4IV'F͍/lfY&o;N >{w?=ǷzS}x/.WJh-8%Ȯ>yuڞ>`lϿGoW{gXVEVUdUEVUdUEVUdUEVUdUEVUdUEVUdUEVUdUEVUdUEVUdUEVUTEVUdUEVUdU_ȪDMaUUYUUYUUYUUYёьȪȪȪȪȪȪ7Aa ~xkkRk&kkx ^VH^~݅/8UV7y wgZƅd͆ݠo3 f#ZR&*k.7=",l:X~Y3ֺh%Rs(62|2# IHp<",csoϯ<'/y1S+V3+6ghxPrЋ`hpܟj38B#9 bb1>L} ;ͨ^q/&+OaP&n0 ` &m"Mе |^ 'A7J^tqEq릜{U!˦lr^M!Y͕D$o&etTe&DZ륈Tpf@Ew޾J =llॄP,m7ofRāYU]3Y="M_yug !G1ElO9NAp - ^de}1bμ=y!feVudCmS]_ZbWYٟVB MI88 $\*Eg+3̴2eI!hf$O6IeeHr\ <ጨ ٤EAmJ&fZ !V'S.yg+* j t^{H:^kJ\_"~r|fqkkPqBF_'_y[+#+lr0^s - ,bH_pHII-ǐ%驪Uu=2Dn]7p'gO\%mu֩r zW:sm:Kk4Hx?w9tfwxլ;\>4ڽm{w ɺp0L=_=wyYUBXP|Nk`cZ-VeӮU ]yP].knn8%ZK9XT ը#0%Ë,x27E`mTRo,5`H`,U}ygHn076Dլ1e@-bQP|]r+ce6=kv?_\Uʝպˋ멛2݅KǗ$=ed6hn]!`?l{7"0%68 $H2h˾5c.Wwub*SPl60!3**͐P].Em}dwO0%^o" S[P$jiLb9t0Y]cEƁNΆA 9_yo_:ze_:CSwow\^zMq܈CkZ2(y-) 0֚87qpKe(2^욋&8+8mdZ[Ǐ颞ՏRR^|4" h ׇz_tD;Dc}>9l3ׯ~j%&0QP (ц*w{90D)Ez#K ]um${kPIRTs,FiזG'A>$M=1TcE^&~.[F=k7G~'h?RxAU4qS VO)UhB ԻKL[1ЉK DiVBQj ZI TD~ OU>eν7r_"0?FC= llTrD8MhBkN'ݨmEqqkc.cXs1d1(Wxs>1W]27 a: aj w{֩{{F̻"]ZeNʶ~6Vh[t{;ٴuNmܯSb[$T8>7T;5:vt IBpVRhW>ݰts}ҵu/ǣ68/oE:]oqm cw[RXkA_o{_V=5|)yjT  4:PBg?{SK/$ 2pP@}"B*'Y.OāSVOLq/) y#c,p`n|7r!V|~Ҥr\~=lpgi5*B7prnd_&9򰑆Ȓ;Sߌe2yU`)U: gg'tI( &* sp8t2lܫkOzq[:Ţ{^k%5KQP-kˇxK< `S)hY +yVsJUu1LN10!zrRY@3ے'*8FBZ}6?w*oĴ}wBsJ>K,{Ђ୍c9G#¿F^ZzCJm_E :Z{&yO'0̱]2?8 ~/ɧqH(dw~{{-o\, FPЭ|[<a][7i=b+uoHke3*7ui%FcyRbE-1!wU$9XwXw3f;ٴ|gNksHƃ#if:HJPU !$d砈рpoD߈u)OV-{;V-m&b#?i|1]4m{ > v0`i`kX/>Fqe"!mu߫^ V@ED *1|k0=zVmMen) @2VJ"B8K{E锨 *ɭ5؛ʘS30hߓEX[4}l5v٪a#A=EPU,z>ˇ? hFu> ̪lV,$0iŵ7ixi@svߦ*J8j.mhWHWIisQ$j591B|ɗuo@ 1P ༷T,ss%,wVz=SmŦAuާ({rYO4h: )%1Ժ8DyNrj h\( 2b 9 2%ͯ"8hI48m#2z tv3*YL)RrnS^.eG):y);闳ɰM4/?H:[ܙmN'2[vKrןiFQܡcqv@u`J@AI鈷'!8&蛀z brtbhM)C#uLֳ UQ*K(JKb-abeV² 32nXIhۻ}& k/Tn4| gg.@Ħ=QT[K+p\g<phT8X: Ys=,MtP:nGMLXVmJ;n'xI]:n+mam{#m4XQ4jZ yK!,V F=wNmD)ucSTZ匏> p0!&;Mg}%b얈NG8['[g1*\d"b/R|.w%m&nvBʸE|tf jaF+ INzxrqoa)yn'AzU;O)U(kmȲ/;rn=Adv 9Dd;Ibl%`l[yέnՏi-oW?\OG~lNΚ+ו "6os[\"ﱷfH+z~y^7 /n_ʆD?gU{d#l4m>]/Ys˴=ut!ۊyuk-NmhfU噻혗\476JUgx@훦hfkh ]퍷8qU6Fl}D> ̛EiݛW/N\2c"(RhP2">P#C^;$G{?5ʏ.py=Ȗn:d+Thnx]b9z/7?|oJG<7:JB42Hqbu + Snekv,?Lr tG}f`<Dtt`3yga6&A׳黋IGX҆~Ch2̩&Z>Q[$$&9Eϲex woAϚO2x5ma:$.}omy@ &N96qe#M^V7_Ms9t96clys !(hIKB .Vx8Agd{m#"qC¤w,qnzëYȇ ۗe4l_qDVJ˺ſ\7pfϚYyOoӫՌ;cgA|/%]'pyҀIŪi/?kklTAzߘ=U#)!7Hn>a5NjM9g&1L@CtJ=. 
WzmC-/xi@NFC%OiqQSR觲n'7U,cOfN.KĎ.de-kcX;0GSƈ}%\]#ɝߝKSfkc7:sSlde+ *#ZaO$Uml^ _?]unFI/N#)Co?ܔ2u-w(DO JෳWӫ_Af~C~aҮHä$d]hySu.| gA{x1>9#C]ťmWjy<}YMb,JS{hzgy42tz|j%wOGoD]7[kv:n{C1?\]{iJ=5Pkgסݞ*Sٷ۳QqzbNZk?jwwDg!ݿZ|0w׶]wӍKKmlr06T=auG#P+)W{t$nu9Ϊ:+c.Èw+DIz,e/-6z)ْ2yISZyXV\Ǻ9|s>]/DLHhZxin^R\(G<ɩlK0[')(q=(S<ŚzM8|Pf= R2j=B1IQ$.L %?S6'!1Hls nT:L.,!ͅ b _X&pR9E+(DJN*=cQZJՑZ @>_* )8gF%Q(Cd^G\Xq1 גpw_&B҄`Û1 E(%ZUdٺLdtx43+ZvΩg<8tA0CcHTYk%M";P`*d2-"mmFTkJtxe4Ƭ)2_0Kukwixa=Zh!cN홴/5G."ID ,'mCP.,ĘZyW^*rr(Jq9QJ fE:ˊHr:@mZGΎT]tvw [P"Pg ^Lʨ.)ayzZ9'%BSXU\4 K@Q. l|`fZS !uGN܏8, 4U^(NUxEX$T$E8&Fl&OXb YGcMQXFEi\MHΚ1-d %d x*WX>WLEHlp`A2 \Z _7HVGr0(Eό{^d[ @$ _XT£ ˥r\D_ψr8bx@j3[=5ǐP?H!g c[o"1) 9x&P4D⺨(XuXDo A$<@^a#z;̺ pĠT4D&:#eUƃ% ~a @V$]@8JVr * ub %a+,X \Q5AvRP6ZRFBj]Rl Wcܣ.uT0 S( k#9J8nW3UV  Ѫ CNl j`Z82Vr[ kn!J~scVCq|1z\6`liN!'8ğ#{`$4|g`+eIQ"WvuUD|Q3Hh4qI@?ubɱG@$TiY\YjUgW$<N% 2Bpj=#bW]nzucKPMOunb $ 5ݞ sV iLa~Km! r ug\/nfד~V?jv;z*@A M '&ĕB>7 - )5w7 Dua< t|&M@7o?΅#_eّ8nݽ 5~J)RP~eRDTW~H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! @d){N$l!Մ<eOI/J[$BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $rI Y9@>ِ@`4GOI/* AI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI/VD3"\a j~$Xi%@_ (I $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $zE*RWRr}y}Z/;Y}WeJ;S:)] ?.ܠ-.׫V uDAe_6ߵn1MZ1{Qi hV j$j WT-3 |g-_`y搜8U.-*f]#8 x) l\yWy^]rab7ݪerl_ٖ{7J}bx 1A˰sgr E6J ~G &UA[_`<ӽ]vOlNyY/bi(^>{/j󏺙[C|g}Y^Fг>鿧{3ëru(;/3/k gˡdzW;?7:np~ZwY?UKukGo[/ã{" ֗zWC|:uq&-ܰ v4*H;db.Җk[t&6qKPmv?N>-E˲ Mw|}`U"Qi0l?)yTAx圔z|)D:^*LBc5Hc?hTpްuɛPWQĹ/*sB:ty^YL1L;L$ RKbu%g GTaH2IAC:lUj XbCf&-4i=d /bܙmˋO6o-ޮJ?v}~#xl@SN_BI /'?~-}=G*[m\fT:GiծLɔXyH:Ky.9&RHH=zX@?2*ba584ʱ`,|X\(&6ÀmL>l avhT7A7#vS mK/նLƍWɖZiV|n!e y hH8CN-<16dEF6ҡL'f6 163a*M“Z~_dS ǁq~<wx,iv r>ydJ RL6J9:RQh#b5sGěC\y:j^rh\d"ŸqNi35*Wh`܂>:'f d.YD'h'hL0.>D\rhx ~y>Ƴ[EZ;ޑU/y*Y 9+ QՏOTh@8pg#ŃН.cSxۂi.ZjT@z%ͤV8Zbn˒.XY-ֳ'\*^s\jLi˧c.;3`]ꪼp89:d]Ntj*-zOEOZLT|.]G훷onTWk=ふdKf;YfFy@Ƀ)7*2l)mx4?{ ߗ5&( OYꏓfؤYϛvpҿw{s1JGk#Q˒bL(Tlg2PF,V:eV(*z5x{$y|XU1-0O”E&I$b.~"6V^ߪ !)$Dw~o=d.,3_s/?E-xc`ܗr*0S3D ,d/Jw\K lq[*֭BjZC1͋y :\٬$X||EqަٌJݮ„tMG>>RM3"]zӫҫLAp{"T 겯jox }jQߛo)g3K'_ηؗ,Ƹ0bk3z)_MF^CygWW&sh`>ʬ3~ Щ<~'})s.w,wk=0:qRGjKy^K"w**lmYidc`=.^r,@[v'lOvv%Y]VLʀDw{]VV5iƼ##QjQhý1И#5V?(۫sږwx|K"); 2sO%LN_1D^̤g|R6sTyӦwW .S64/nKϦ5?ԍƗu]'UC z7,ּ&neCϣl}mv_uk^Eŋg] bpޯuxiۙ xu/x{\>?-{.]p{w7=Ӈh,M.W{@^bZdZ%Ͱ𥥑kfŶ 2r8ɍb0S't8 ަ(nӿiu^զ;iw<=ڳ \c6}o\j0 egv!rw?l c?Wx%so^T׏ݶR!^_ q5T8^Y'@p+P !䒜 uߟ@^Ivm/+˭Y8~39d $o^ּ_߃ ¯}<:X GG_뿌=@,)d Z~W_@|yVH׿z iIC >)0c]5pIB/ג?헦AV;^SܽY=+SI%>8[nՎ.!} ,2x1n  uƓX:ߚ~\䚞.#Lw?]YOel+;?_]Uߗ|p2V.} }!t8`OC>ʶؼ0}͂{P>('JY(*w2 RIk QHF ŘSQDaK2zw,ieR1AmԖhI"V3瞥9,пX>m2fS<(G Dee& 9KZ tZp5E5Xj.qR_f@$H c2|D|0sٲ*T;ot\Vܮzh ƱQP+%D1cG"]ENRYiY( !7BT:#B$+u&  yV~>}/\;_R[=#"Q/$}V/6RMs \En3'Zr*NV~ hkMikJG[S ٯ!IMNnII6D9g<[rNQ0 $)JHe*~bs:,W}s%aǾ~dpg}nh>.0o.Wq LQ=:pkSL3Og$JD;\k$(RIl 0O g2̗(3_vw |;{۹g8-ԱxS[x vvt(HRQC.ףvm\~]ll/nZ-@(оt9ŌRrVx[#l#KJ}(`EX V !AO9.OeF$|VR%!|IX'^wyͩ1s|;(塧jw ;67鿳 *3rOc,p)8$MXbTS~ XNа6bE*(t$X&:Y Cs&!qcS)R e' r#fFT]c;1ֱ) Kat ]=릋l9K/orW.^mXΡcnjU-MBo7\d] V0S~29+e{gu=0+ձ$uE [UH͍KF%m#gY'OU$ :ΚLDH1yC}L=YpUw gd~(ةڵE Fw6@j5h$A6zΙL3ȥ! 
;ͨ^Vi]^]rp|6ħ/wx=oϸH}^oQݘt=QzIx([?m'+d])KzԷ(ӊx9s-Bd%,'.P:9ExAL>* aaD@7B%//BXB I8^e0bd|Y^.j}hjN cizM%甽횕YPU6!Zĩ^HeJ$6*w>Mwwy^_t0l؍-'o>VAd ēTK3|Bgs.+;kg{8WM2R߇cγI^bE O1E*$e[Y仿9xCL:cؖ8_wu_Q~Q懏 >tm} LFʍParR0R?1h`"JƤm"Ѷ?xCQs~(\p`[SY]n{\H=G/O2\`+1aX l*AKA"r+, /' CN1HҒ`N#(p'U)C8pUQU<8@$wZw6`5)0K]_6V(j?"xW C*Xp qG46(Sʠ2ETI}:wj0R("O7g<]^4^|i'g!JB G q#m5Fke{-:`a' 8>8zwD 2b:H@5qH*ɬP꥾3?`/􅰮 2pH.2Fvͻ=K]($UU|Ͼp5nA 5>IpS\etҦs'IUkҫRzv5O1Z^9)Y pȾBӯ8_eU77, Fq< Y4n6yZwZ8U!VK$DviF~X:`eo'7OE",|H>8U-p5`MHܮ:ZMZ= ޢ LE6^]?~0 jCMCjքj)/`ߵI *)7>MzySTw(4Zսm*(#{~Uh^풪\ƒpv^7Uۍ&6n"Ok?|l {Jt}ƽ1fM"]n~Z]?r;*2V5&K`ZU$X=4[a|`Q޼ODtwiw_6֤t88I,rF!ϴp`)֖6}IȻ]7Ȭc9J :+1t5Re-G # Wz(ā.^Y3޸Wdu"+=tv/3b?%ynIĥ>OMϊ Ić0HoO_wp7&4Qh`rH8eFa0鈨&{Dbi&hYIvV(1]t2Ft~qsL(Q ۔Vx>Ew9lVJ'B"n;#gu :__HL׶lуqkϓǧ\~s?< l]k|&g21 {dM֞{ƣ]]C1gBs go~[vU4(eV1M^D`k"pi1%0AYa1B 6)nV2 }4B+]9eSŹrLL7 f) !P0H'~XRaD!IE$MZ":;;#%r,uc6laL}ֳE;+|d W쎪U}YQgu`FVk$Chg d?OenQR_THtTݜh;$vrۆN͋]\ڔo+?$\_v[< L"U;*jRǻ,vf}6YyW@S`R;P} 3ascZ, ^҉v=F1 ɜ>\ĕh?]k{x8="]MQwbu }FקGǣysFfj˺ \H(>zIWJ9t FC93ȥFn1 ap)1{L$ZFe)A@[/5^#:b茜[ s>?]Qz饨ai-\n&x_!bW56LjvX|U f4$'e@hQqM)_JHe@ ,NvmI/:ǂq/(Q4dC6ZYL&H}4ʼZR0%m0YJ ƅQgۑ6La J9#! VĀu]{ Rt՝jyl?=qI;H VٛWa$yh.m%0aIa@d_j,KFu0Ư=ikK{ ~-J_MzywL_SX3v9F"rzN/Swjpe(0y gE-6KrTa3i2'€ y<8O\F"l#3b}ͮMHQ}^+4ѳggG9C ƚ%Sת4Y׿k: $wm~09_R0.[SU:+mFXlRy]fvY=xlm ڂ+i0e9T/viXmMR䣋f4a=gmn)ll-]55Cefy h\(2`sJ7JVN6WL%|[PHj|?TWҎ>ٸ<鸣RmTOl0*_gnp} Ջ_~{7|}瘨߽8[Xu#0 FIAŏ{4jګh ;Uo. :Zo>};/S/ցVӟ"|TV3FN'_ V`qY<4(p:zeV Ƙclr=*gTpf'ۣ3;# FL.W0O(M6}?(Tmi5kó:).TʆSxeD,S.f9eS<`s-4-'1bePeAY~,Y9+%ZsCӵu]9FDj8p\g?FoܯսOD(,=Y4-!-N:0BypI GAYT?AxKLձX|{K`t4~@oXH+c bSyHr0ًc5A=40g:%Ǒsc ) e:*6Aj6`6MOnzP*h>W{[Ro=`H{)^+hs^8gX}fV!AiqnmT"%[?;OyNnDb ir0}3@$<ܥCߟ߼|Qbq٬R?&, gX:S^3ϜEgeoR$S5ww.sj沥ʂ(y>,ԑl?֦"&wJ?W|^2Uo04Q,.H-^P2.+Ƭo^* cgf),R E(Zd חйxYw-U-EM]!`ƙkۆI%*j77k 7Y\P +qtf]zkӹdb=ST cêzd'ly@cȃ\"YP}`VW{ K&B\F_sl6<4_Jc2;InlYGQafu0pzñ' lN(aCRU*>#5jओI&s:+~c惑jr$8 YPJIdx`VX a bsR${ޙq$id.*C} XX1= O6&ubERT.VEeFFOeT9e#CӢ%PHPVYrTViեr?yI>jliMZ)okWj5uQ"UzglR,}4z'\ iN_k-6y_G.l I5K[[jVn›m0YZl&&MjB1&Zj/fBuؕr mqmsQ Eg)b/9%|>nh g6ѨoZeY=ZǬJ{w{)Fix@ZCt[Bȥ1N{'&W"NG/D `-N(g^Q*-V*R=G^%cNօM,iKEHԜw&HUEzr7NTRZ8'RIIE1\2'%d ~ :ƾU29avnE);IM -ڒBu($V #_'tH/"MYQZZ壝9^zOI=k^,>YІF.\>E=uk,R`䦤1KuXe]!(dGoOM6deGގ0H%GH ָh)4" ^TTlPtA[ xiЮmtV*A5V-T첫ԔX@h4yֲtX\c ]Sb n$р`44kݡ7Wg\C\-k8)qXa &ga=gA?1*Tm mkGU((|+`I=RycsN[5mB7* bf2b@d2!h\F9EB< A2&d]g(MJt"T]!z@eΦɤRȠU %;.j!ՠPwVrEALAAX' ( j "<*f@E"3|5g A'ŘT:L;"q`LSf|)*)8ԙݼ'T bXS9(y3XGfұK!\h (65hlDC0GRF4U \Y[H( ;:I?Ն?5Wn+a2tYոYUDIAPDVKzgd0[QUjkIk ˤDpPZil,@E@H v/UTy V"Кe8M236/`VKQEn"棊1!D("_ &ìrtiC4dYt4[jI#JUe ,äc*Jr6z_fVcQBgsA9AJ# I"8d^U0P>xmBV58 e8my4/!!dn';lE^QH"NdviP'7 d!g0Eʟu7/V1Z8 RTFjt00tRuNc-и'`ҕȪT~Qc@sjo6ִxL+5ziQc͚ b %_T4/Q4T Z W-ECpsreA^ Qє?샩j-*awhU@  g` ae+J 6'zIs!y~)3"Q%qTh5]QzJ’=i­6?pQi8m!X7ϦrшUtid"|9\VĹ$#4Pb鹓k|^U =]wrt o`j?zB}u7V6bAT ft֤фǠC!2$vk{7(N?|u2hQiƋ4NSEF 2v'a雵ٴ={9?MxMއ&:H mť~$?}$u @_ dE$@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L}$D?`H rB0'` L}$bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/!@! s}0$'`L}$P*޾i@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@LiI T= ⡐@~>"+/UL}A$L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 @,LT^=yIKM7~.vPVgC %`6nN, ?A;[W+WuDIo0ڐ7s>?w4]z1`_v2teYnkQ:r豪̽#քl:TQMaKGz2[suuY958=9wPm۸'>]٥rW.;RT]>=z8RDpz7}zUlezUoW  }OO21:v؏'>g%!>wO[+OgVWE7\L)gO_+h}3TYUw΁̨;g3-ݹv8\b2,܊ |m=$ **z'Li['唹l!v6va?\_$D|;|^! 7J XG^ _0iZa}:9Rh? 
2հ>jx=o~;\:㣯uG,yhxlY{ʰWGͥ}G0.>}E4 ~֫Bo_HqĻ=xhMMg{|#YG|J{zƗOզ.t">7#U BqtTӋj7<4_f>?eD `s8z+Xa}0qZOph،!9;j:X->?COoz y3־IngF޼sugҘϘV٩8qQ dJ !8xn;in|¾A;ZVAnMJ[wm#IrC6[LXఁO[R$ىwj%YhedbTuYU]zOEfCw\E5\1p@V/ ~)\-9.%?(/ZIɶ#ww0zfQ+1qg\ o޿Û?~K@~|l6<X|^rMb]/XkCq %j{HwKxSh'qe}?6h4Z $IZ[WDr2S Lr.s2;SDbV̢͹'Am'QID=5Se)#܅TZpUVf CqD6l-j1GN%sniV+65:66y*EB=gh6`j S؊@FeSi-p(!1`N-f nxe-LZ ½a㇁?doa/^iNb ij}gNa+{cäXwTsB#Й挨"9QZ DwGLNr.E d6^ ʄjN{ @; `KQ`ڹوl#KzHz6m r)_Ala1b-U%A!Z)TqO!g 7K`^;y;J\M tkͨuyUc ٤:ݟ%(`Qϕ \21X#PZ#Ѷey(gPvVf-`P>{?N}ovB]nO`*pӂX lR.hU0d6TcA6h)HQn%A^4B[b)RriD62eq"o #J '7# DBpg}"w7+5|^ ,yw#>>IS+uܱL0#We)GO^'QsC6=|g0(2kD2+!azL uӢAf]7E]#?: $;PI ,?s5nA 5>IpS\etҦd๓HF*MjjV=zO%@Zd8d?OYϲ͋ C1OG"ga:;˲@ q%b*ŴH$Dvj ?Z`>fCS]ADQv:EX2 g 3jGvj6mNcc`*Zi¨mR7?7P- xA"' ~+oKĝ&TKiVwMqLne ~*$Cx2)#`r!nmqIMHvkjkgBl3QUܡdceHe?*W_^tx=)"6?57Mxwx@n} )3!߼~]W6_ܓÍ^ϩnh߰030Ny \ 1 /ml{IkL6؎yJvX>k?{/Ss{|;(&K``5+I$X=2;a|5`IƼӭeIt6|OE2B&Ib3 y>(C=(LN—Gq+Z-ёY߁9cEPjyjPgTSNE TE+N^@O<7ny#Q݊ vhzվI}#eI,IYҾ/$"~t\Gq(aAUNh$2}֝f:"@` `Bt! 9/tLNSƈ6.r 8Bs+:, Ƹ#aRʥX= صut[5 L*O'E=bmzuҗ}cz ћcbbDVtsoٸݙElN]J\0sf!4 8J:೰@~ٛy'F帣*&)lM$%L Rؾ0ELu#vc,pmϕVhޙ95[+wo~;dpgHAX0K%Nq\G/@r=cĒ +$ Im\4*kc&PfTet)]9b0 NPB2Kpg/[d<yuWoYu#!Kdy>GUtVQ"sGsr9" FȱG=r

.dLI0 `:Q!ݝ/0M]Dj^Kػdw-5W7wEN~H௙^}uZK-x~_ϊ5vYz9Wmǚ׷i_&OB|c PBcGTI;|%Y U5`1$s6g\ lrIWFJ\>&964Oy(|iϞ`'EO"A /RP}b!ns@ $) rfKՍ:u1 ap)1{L$ZFe)A@[/5^#:v1t6.%?6Gi&CZȝO]\_nCO^EsdoFVZ H8*)Q`K  tI>$0 ڰfM5HwD@4:&DF03,*͸`ha  B' LN>TK= EX4x? m?jﵲRgMh01y/3Ràf)#*Fg6FZHǰe%}UPȑV+b@:.`=[AMvugp-p2ZB6K?ſGJ2̞G*'G@sɕ<"'a d4&~,K7u1Ɵ^$,].AFzzRmӣ}6d`k+9F"2bf6 gvt9A Hյ_%P0 8+:mqK ۧ&KX;.}qP˲X|\ίcLV/S&T$*ʾ蕚ŋhcRp o{4/m>^q [H_R%01[{lq{r3¯ŕW׫ٖR̙:,hxrX[vQVvyq%ChHgHoi?'X"x_ bbq9s)>}}x{cL뿿>~:àq SNwYkC ƫzQv͸Gnη\ڡ[Q1c8se]h<)/wM@dt%u]_UKN|\Eϛ_|̗*D q;CjKiݚY]vwپڻ<MGc?Ff Xy4H)ʝ~k>X1(ϰ* gܡ \dXk+c9qfsHRؿ QD:M#a=ׂJMxٻ6$WNe llbeSBabUϐ%E6-nðIpSUUuW>9Pƌ$iꉮPEvxLE \y1ZV"{%MV8Zbn aK)w%6'gYztFѝ4:E&Q3K;p_eJ=Tqc(EOGch`*xk"&DtT#PrP{<7{q}ʘ'~k_] bg=4Q^bs9+\"tdin~' G?? GW%&(b( fg&?…m._ `nK mOetH뉐f!9yI8/`(ߕp҉bzYK}4֢o?6K_6\;YΡ;Ne )AW "Glo)ڥk^g-{Ӷ0; )%18DyH:=BV0`$7)f.n.=,%bق +ؠYlUKُOhdsi^F?8 suGz߼~:i玆ʔ OJ%@[A,BF97g78mMOz"aB;&g}s~l&N4t:ԙ(v+B=ZNg-K};ͦ_eG梙V̎6w/I͟^jf}Ë&5xmF͟Q>̇z6]d_1_^٨% ޷k'Z6Ü˄=dsևC^nW)FU~*{j5VVݮ]A6U6S+Ȗ[pwp>ƻ}MdSj*k*zu{exȫ QAhhjW=W>,?ʿC|kzj'G-RSon 8(w5:Fu֕M}̡…ǴK}]mܞ}~k&$V JK9SEg>^yobzw ~~QfnҳT2f!r ,a!`fFdTVQ&axQNP'@E͝PFS`&&.3ZiXXk˽fǽ_?=eyKwߛ̘Uu 40uG 5wU*U廪|W]UxvT]UwU*U廪|W QwU*U廪|W]UwU*U廪5`]UwU*U廪|WZwU*U廪]UwU]|W]UwU*ճ~Ԉ ]Ve<rʏU}Kze.@sܠ[XbPG_omTDɨht2獯e.e\WPR?՜,,u'w,i=ҲmՙC1Z«-l ymOTaRTN۱w=>~.R˳L0ϣR 2 H%傩DU p&q.htrAvZhk}qFp"~ۿ=B !tZk, $Ǖ{ ' jSB 5ER:.Eq3f*NHMpE}2@ Xg\Fg}D"('¥B˻&,p?7_ϱG,eɴ3/4ar>x>!g{!EScOc匛i"k< `N/xW#y&#ب\ђȝrr=' e㣻!!r)rg-ɕ$L%P1sk+7iƼ#hD5І{cVlt@4׹uQ9)I"i%YD\¸pVL(9yg ,.4:Br Ł%7>>J'b"]#Ƥ\JLr<f6ɥ[8ɻ A^64/oM;l~wI GgQC zϾּ&ଡ^#~hy͋^䑼xѼx3 >Bhb"Fia۶e=> "`MsO\vr8a2/}H__%Jؓ3AдGk[( zd% vZ4'ٱ98>WnRG'LQ'.H3WUf@jqtGG8Z7 6FcѰ==A~''S꿘{iP\4|?3&B7M2""B=81k#xIا[2NgdX&4TLN>{p0&-8/EI!I4w{Z 1xbvvnm)"澭z37xi4?c dWk~ףٻÕd.~%p}6)3cVp]|@TϿfG_zk399Ff⼓,E>)g?QfXɗ#0%`fae"}Frl fz tWg~ʴpYjkzEg,=*EÚC2>PĻDWu/Y72P^zO4&8P7ThdNAD($tF,FN-;KS@CZh|jq ;\_SQ[%4CᬘٙhmE5E|"J/dLCB?(GG DeemKC̄.mn$A8n u3ô˜ftr.Y%v/|WvkkU'mNz:K:VJJb0ƸD:EN+ghr#.Ly1r.Hr^'퐎эGȻ6򫺶};eh+o޳T7h#Q2a֘}f4^d6R>I" Ad+!ASV6aYȚZk󋧊m|gs]>tj;{،{rxNQ0$rOg%lomX+h>$ 4pqWx8%H c4IN=X(\U ׯ՝y.齼4_Ӓ댣ӘI6@ XbTSrXTlx~Q%Z$BBC"̙" чL"qcjO8KNPXa%.6x i2gXb/@.vtUWߓ*qg\>j^.Åmj3s&naWQkÂLkQo3aLٻFUsv0`$=;6`' o!tuWK)RKRkˀ{CJ8$% \q5=Uտ1;>;hԕc1HgrL[:'#dGVDªVqNmeY 2#(&{;vfS)rV.nv@9B=[0#TM՝^ ' @JQ(S<S)8Xƃq+؉p~}u,=x;']{ 'ނo7HFw.B}n0=t`ua8TW) )BEPXz9@?$A"̴ C2X)2NYArTV3N{ޥ{!])[O{7˗v/; V`Tʐ1#욍YqSD1"J=hF¡ v" z{%,mfܩg v'ص ooԞ )g/YY}{x,xdϐ $ ޞLj|`~ϯKAmʨl:'gIh2; 9J22ƆPʶ)`ʛ?8!ܸ TvOK a;?^͗m/Vz{#G/ٗ \>ZAJOVs6 [#27Lj0  b)J[efe!奆$LJ#0:8Ɛ5}=ԧ3Bbynh;Z(U3'6@{.(%Uѹʶ@쥳I9Y5x)R'WFLԫVkצyQcR! &GaMǒ i:1V3^tEG 0*|~I(ӏ9>yZd$6s2u|dݓ9$L*m . !Rߙa1AZXסE? "Qkn \yOdGx %^t6WƁqCHh&$r%D*bB&Ƥe6fcrIƟҳ^oϔ[z 6rA-8W8O8MWΰ{ '0Ԗ34Zp 80Ok<׺?[x9FmX<^tݳ{koi}7%j}"D1{S=L^ľ{ 7 /XNi+oWgƕnqBjM\<=}ޣ!' 
j;hSN:ح6om7qː Q+ɕ㏭kڍ-eO3ʞ߀ _Iff]n~W~Ht0-"}loŏ՟"r8!p1ef79^$5TὢT] Ng;\pOdC 6͍&) v^- ܵƲ'i`tOOx -!Y= 7[&Z̈́n],wBD^kTTe/#hKʛ&KAԍ6S6'츢8{}Bup(㮨Ęqi<;_H >EoryI{!59g&dύa1I/|JMb2 d+mdb2>Rw'BJXJX_dR: 9p%Voae2wn;#g7% ]iWRSk[\hy4Ӌ߮/6yb~$/)d\KkFȝQ,W*CxFNqa𳇟=i-瓬i<:VVCD2 ǜz(-cV ht9p#&P\Bs TW [LWPa:F֝mǥ 03xbz6ܠCu 3iJ$"kЀ2k1woXp"-ɘl.f9U)gbN ?!HPӰɚίjKz6+`S?lgOo4\Mmgŝtu;k7[jі{ #hTg4G;[|j+ wduIw&=#r+i*kwhy);@dOݜöL*wW;L/Wu5d\ۻn[#Er覦[[^ݝHhwg`e#[-'ѰdsYHSNV#(Bh 2U`S Dȅ`I*RBIp>*.8RI(CYR:#gOߊϏxT>=NB0VA3C6OK^fR~״ܫd.:-$Hdi/%'o- _6k JQ!tJN&AG3\{>KZ樬JlJG,$ XevCqǚqq\(N1>}J2B@yʁ UJ[ ޭTEFNjV oFif{niAr!WmlTpс\s'ܡcQ`)#ddbbtDRb u+HO9W P'ܑ1PSLp%'W)'a6Kݜ/OpogE"cH[:LW^M7 xVgp< rn:?NB41Z{NUIJ_/jƫ5W_/N~Tq2xܫ~\|YHz)₶Bs}7X{fkwW/|s=h>xsm ,[b.oq /k+q.||"W;׊3B­3)y˶iD4ֵNsYޒ &OjUN⛣xt\|9:`g]>d۬mʥA.eL}LczF+Gdq@A2!` .ߞ}/[|?~g| wo~ :cᤕl,sGL[N-cEY}U]>r˼?l}˵A; ) ~qpKeis5*U8 +E_Z.'>TowxjC< tdGa#C旻}o ᒟ}B.htdb)2sNh8lWcyyFT8XёY<]nIZ%H;zsRCW~>#/y)-})(r\9L"%퍴^"yNk 9HrPfdr`?J]vO*uZR=0cVg1VZm@}ljJ.F\Jkt9Ҳx+쑺BoGWmO[twR扴|QEqkp.\r(3d 6Ǚ6 ̂YHBe2RtD ؈Oh*[ҩ$]茜pc2Ft' 3Vyc?àdPo-˦(d#' eȼ,Y2Qir@0gEn%3ɭLq$SIZTrJR[LRuNʘgAK7\u*g @&FFTYLd%ph-&FSeh1Bl9댜=vy-Z;gҦ7z9+ 2&՘-2i\pŧ%"zR>ozc ɠD:7 :DQ&6zFK.7 1ICΠVz;͏(H/qTqeM!it^t.hÐ> F4W(:,'^aY;٦YI%tzB4&Jad!\?{ƑqdG t;6^,F]-ˢHE#KV_!Re89tU_MW1 -F^DV,J#Rݮ_%ƷGf: 23g(<ˉDe lrƃ Ud|c 宝㨵B6U:Է-I \tHP Pz QN C$׋dq}P`2 2a/2d|x{ K:u_՛~ND>ۣ3?8ܴJK_.DZ$G7ߟ7 0z^?zwyt2K->e$$!wB$V{ |FdZQ <@տ74ʹjBa8^5oOspgE$ FYY+VoZRnKWSJ nJ~}fZF7.ŷ623iYځڶz Bq1.A25i{/H߽h<e7_ݾV%Q>Q,hm4D /J޷5vڶuf;gE@l i-ڶֵaD󂧭ښV X $A,--/@ }w=*WVJ%-:Z00BɈ>yn挨#҉3VLUFֆNp^Bov\{I^xd9s2A7ZzO+!T' "LeߓV>ckkY>oW>ߕlږZLP{1bCB0ꛋM@׎~Qjh~vm)T4hhm6rLs+B~ (XH.IL ,ΙY܃wl @H\58_^w(*wg{1)aYM!'ѳ#. 8i Dð((m**f(⼤XeQg'UI<2+$ ZbeC6{<[xzgNZ'e>/ZDW|ޥZ.?yzQ+] bs.^(FD#F:0JfC&q0,oaю^3GIq.eEiK-& Fε TmXm8w{zX/26/t/ܪ/\\٧oO?ggnOhh4vg_ƒO3zϳ zYCN2d'1rgpFM6 qkV.405iI@XﴏseUN 2'|qƌUčQ+}6|X%r3 : Ht1&#dk6=[ vf\묶J6__+3`gBzf# 5*Gv>y|$'tw/nwZ7dۂ +g"GF୻#% wV=^IɆQIH' ؛Uwvv5WMEfɴ,@PKc`Tsv%Ipñ|ݞ em.q!W+Vb,ikQɘu' ,gYiKD@QҘ8 9y"90 ^W=tnO2PKʹ#ʞ;N"(}`1rA?6#T -(گ& Eoi'Ol MhwhѴEgxr25yp&]ؤIӍO^^uA3s~v\H^n^16'a\X5m}!zijΎݻ 4a6/}I_m%JINдoŵEtvbW}GB1X`^}9fYs@`kTsZZ&ѫҊ?E'u$ģ;‹,D_{u6цqMN "'4f4jzoBQűo?_t޻yT|1 WEcbn:#J98#|:)#&Jxw ,rƌ1wt&G%^:<^wFxG}6 F#qZx8r9fM!^sO=1ZFJn/'ӃDe3 FcY!bFrsRL.`V8bSyR}*`ԺtҽN,m7Q6,r)>1)ɗhCpVz"׭Gw+ 0F`^,S2d&fO—󎯠h L1tu {r^~obhT.aTAV[O@̅r2SAbwG{ԏrk1ȌLr!g@q$HN(<$+Rg䶤/ IuKs>qWDAqH\GCO)xHYt&` vuMW>;>Y-#^4G8mҭJ{Jcw6z99)|]1^}&ꭳ~=U꭮gZڿ&ʚ#*+'Z1<CYsm¦]Ax(kf}HptA$:⌈%͖Af1DL έy8y>uS+3+nǍC65Аjv頝oi>ODr@JQ(S<S>E n<',-`gNvEVښHVkL<^Vkv N_o@I=Ic[+7ew &ǡu̸Va-HZ+: O$M69IEAi)ՙi-!DX)2NYAˤ@9U*KSR3KIR&o̪zᜧ{/>~y'~O'#,h9fo1!cG5abڌYQrA5UWR1~;q 9VPJ v7=~mߤp0)>WEV;+K(=;=(v%w 'BZojs"hBQRѤ8\|qsFǕ>4ߦI}zsݭޛƓmxaYRDft^,F uV:@gTFh]?D33јdZE*ǓVS…b6p[n@ pi C&$CLҁTBG$<`&FE Y Mi&r9CAܟ$8wrtXd[mV7~=]d=xǘ.p <䲏\V Q0Ϝ[nh "#6j:~W/PLs77'`DwtR[2V9KO][Likv-jY$q&\ZKZxjᡵin3CQ )*__]OY5Dq0<v. Y5$+yr`#7V߇ck{ l.@KLR>仿 %QȢAGUA$pA D.r<9.#n>j&sj4PA˶\6w B2rhCn&#-juI2">;;lE,*3"Z6LWYK Z[sm@Z^-?ΊqfPi{g֍JؐSQ== QAYCNr(f9*Tΰ߿?O]"`<j* hI%E|BM4ijY2 TDāHH2m(;[ւV/ZЋl==(ll8-F_88sy]_dd^zhZ$俰P!6-lj8,ɥ5v%VՊsg͟F-P x@jOɵJ9TcwQIQR+]תRn55RΧ6.8G])}OЇŵ9խYlfV_f>xu1=*8[B jjĜa:֖˵b(EV^OpqRT<# |aX0:KzX> #8U2 W0bGoBO. &gl먌dۨms\ ړK<>}bsl?i1ܐcS)VtLk`8<?}^pD9zG޽[Oe+ vg~ى_ЦqfhnCkd\[}r+vwk@?]|}5 oYA&B.oY0 l~Vi],T7RR,nCtT{@G9Y2*G)􌑼?UY6o }z C%?j{! 
`Dɱ2cڲ[c79htWRzj%h (ц8*IT> q.[A{KZ#.+f}m${b먒ٱ%O0"(X*-Q4RjHN{0#9I3T'P{b3SL1qQ"y%\2ɢ tP E}pNh-!D5ML '{tCpܳx\%KV:DQձNvzd.N O5iJ1θ~/r+f-ZxvSp`hQ[5'߂ȎRB<3phWd?#XL+o7B&hA\ׂkub^}0d/ƷlWKk X@&?i<:+>AN,aO[P-f3#<7Q%PD CZrX|Ð݈3}yӁVn_:;mgr8M1&cXr1(WJXym @fcalyz]mԽ]=#%3[/S;+f~WBkbW)>MQ-hyۺ5dQBʃrv?nnˎqKm(\ڡNJ.rzQG8/mf׻1 _:-85-82oi< -!6pa{W,!PWsWPTWk 'kfs- u;&ٍ hY j?k:6a3{H`2;&fRuSm@_XurEDta6zyOSkIUbj[σga6K%H* 6T4S?3ͯ=O-Af IN3PB"BIfIFy @".ȕ5@i}b;ILHgcmq ]̇]LO+wn?lބ&JG552hT$2'IGSPȹqk#i,)v,- 8TY먋\88 ;Lc+']g팜g=>FԖYuك^{\=FB5{,3 k9|UhĚgBx=%* c9(h*K8 ZDBs\UA1f+Py΢@Of.#F%3Ā6rX*DHA9j%Yj,01쌜"A#9+0$&Pc\"@8#*;IЈΑG.LM B_pa.pX|_Bnyt>K'[LROƾc$ڜi _N#rz?8?˵"b,"ZuH ug: s'*ϣ6W30â^@+.Ջ_̂d`.d Ϧ\nvk(}ŷŬ} 'n0.r$^Dgo!,4TTD^[e$q@ɂ S#Ayv 㞊NC(w16$;lgQrUl:[NuGoiNG6ͮqo+#f~nFx-_|he>Zdj.6 .ڀ"W6Kj+]Vkֻm~$5aMQn]ZwtYTjwZW]7zlEe󕖫C6|>H>yțU볯鸱Dmwyze+͏f3Ո={7^>aIv[ں -m*斶 پaQ1qnX@^ 8[XAb4mvf|o_~iV;AC:!$>J)(c_e~s\PE9]eQN s2:͜Pd4?*O/H"y!my>-&RtnjI3bm6SL>*%C(LQmWݕFu G*fj݋&=N5禬PQ[.OْwQ%;tlX(/xkPUqCxQ+s8(m"%J* 9!.τ;wvDϷr \w.}i%%%$ri;v-]L,M&ZÜ &kMboqwY(tTRxbK`q%/hveC/ ֖*- wzhS޾-,(m^fY&cEK?1 m5/Pކ索γxvRGci !Ǘ5| wmN/[zY1cp+ӭ3<mHܳ$ fˇq[M?{֑dO;o܏ꗁ|.6"N"g,D<>.)y%Rjǒ`;%YꜾ}ܳfݗ, Z%#>Xx:e ?hfcf4KOtU_$sL>H_Ȫrugl¥\DL&WKqMs) Ze>6G̘ sVO|2||vG[:Zu#Oݾ(Oe}s|wy}ݷnd,oUV^ fX^^ɞ[Oʝqx-U [VƛM ּP% >*U KP[.-5''g9^-GDy}7wO˳mᠳ:XK@%.|jD+d|ɼ2x;C?\B|9joUZK4$m dBP ٚf} Q k)R ݲx|y-o"j^kSQg:>-+s)ZբBl T}Kv)DUql22 Ib7VmM׃~ߗ b"%$'12fE,Me\L\xHjNC5NFiܤb`*;ӖBy=QWbZCt#糷o%_j͘\5#jdit2TI Bki{,u_τb 3.Hx30f3NYVIʔE8˯=„{im-Uल8DEck^:gICR [AioeM1jD䒳"acY4V`)k#֕RlT":˄wh\ zXo+Xr E #1٭(PiL@

T%@H/%qI>$Eh8PZ46 je| $zbҰ Wmk?Z6i2|79[v3 ~)c( 9i`>f!*SNdvXNR!&`"]e;L0ֱ ܠmXXgޅHМ6D-$L>Fjp|6,YxW2z\mi>j\U>8xdgܲƁ|QWcAZ=%E@i$A2 !U4V8 gl,9Q8=GD%YZ{[yx3 nCf ⥻s],TGWgjbQI9m,9xY'O_6V`ˍ??;aǺG++q>e[̼JX} \{#B|t)]L,߃+ 7D. Z(|PDP-fx  L)ڽXRk0 6#-h' yY.z | qsWĠ3`0cduJpӃpMȒ,\\v"(N 0S"Io&CdbzcUjQc< üO(@AV ?K$@ l\j%J!:ʘ& HZI0xf@mJ&QSq/ۢ HX6 sԭ״ג&E`6-0ڽw7vӫ3/˪4?8>;kDaiFh%FB`a]ۿ"'4Y]aYT-G(ZthƤ`듈\ eWZTH{8nX Ȁ5ȡM!ds \A7Vh-.?7 Nw]r9V32P"P`@H H*,m􊀔wkf=&ÆQ|?oyE!XIBivַ7H]7yižb'J1-FjJ xqfu=E =J6"\D:csjg6jreO 8V׬T% %K:IL\@0 >Y pStryd}|+Mx"07SAj+v'Qupk^6uE.t]`<02ja/ZC iDF:Lr%q={fm3gc)kĪh.M/2caVM5"0rҽk\7Y~Z+zG[V}TQn}ԋB{ή⦳˭& @5'!L^46pՆ"y ;0SlUHLSlVX?I++w~J+&PJ{CW<@4SҝZ9+@t}_POk#Jڲ܇לɇBۻu#@ I]c텭Xk W>- p/ tFOndٳ&>Q6[ْ[Ji[Uuzm$R*Q,Sl^b9 DZh~E3_S_.K%*ɻ,R6 kN+Ī6 zWdFr|jM q |۝;;|ՑV]p{9ȍ.b.7y-޾;;yoQh{mbo[{ݵ=3z}9͞Sa!ghOa!N iGFBBV^G ]~ Al~v{n~XG OuvGӑ6eyUCF ˻ЃСKlݠ,R]eM*4ޟ9z4B\M==uFP/u3_4ym4ӳOlnLGTOM&@QtmgXPlPAZd#e,)R~kHq cn_vz26xxt\l͵.0o_vkK$j,ޝ3"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!"!=h˧SӋzwҴ:V9yZFԚ~,?; MCM6Qa SS\ChEuuر}C>zj|YN}n~N]t! :5ݚ4)iw˵O/׏= hixp9'̧K@ϯXG`&?x]v=2PQ~.G.ϩ Իz6n`4O $nz7rOѷw7!!!!!!!!!!!!!!!!!!!!!!!~oyH쾻j46b>=^._/.:5ENȗi"MXdjD*U";?Kusv;Hu&힮^Q˖v<6O4&,_3!JػF#Wk9 %|A`;{aSD$%Y߯jOjIzEq͞ꯪS/Q>7r4cxuBA֑pG%_Y2. bm=ZMUǷ7IM9v9'LQgK{}gkE;WQ0WdJIDS #Op)0zE-Dыj5c֌J1]Xle Ma]h:]p>""Cj/6@u썶w;x jd yNtЧT[ƫdXa07 igQ;SFuƆ/=3Pmr枊:mGL$x) g~\21Ek[jm[XkNkwvۊW`,(s`X-HV]`yK! ,'ɭ FO9F[bEaJ,d@T`tkb"ALP cvdLևņ[6F:FP1F,>ԈՈ:i +^(OPZEH%kA[#m0 -rtS6F,mYP#nxsQ{Ћz*\,JՋ^^o'fjTΉYKV&>F+ >`^|x08}ӇϠ4rj=r7MϧڸC`-rFc#fUJ)Piq}.=Y_USLP SaN^6&:Tn1!ǪW??Gx\`8{/p0T+JrWqP/RV_;jw_׻/w?Ñ{W0{/B5;k.zvikSnk@9\%6\&;4m/^nx񭻄 k}&dC߳f84?&M"BPA!iz]zi\Ǫm 'j §j)7ӿi]&뇄Pq mЯFpjdrᗫ^1||{35i4 |;y#Vo&BU7 2fz =_.b5TݐFłrCL"P c7Kq{<[ E+^WcIɌb'`wޤp~̌av!&26I~BQ?hoC5u5>4{xR{_Ұfy A}kO=6ED=xnѻ ?UlCN $2 5=knmUlXaU 8ӊ$?75g[YsiDB:^R*5 5DL)e@P+DpL7R𠢷Xb$o|9 z~GsJ7mg¼ cP hz$d*`OBqWimڃB=g.`CNz9HEJ˞c)<=Z HDhB&]P[bgQŲ5" #Yj8@m^ ۓ -jC |}MQxrv.и `Z b::xty UMq w35_Bi͆ f+jRvY\n|ψ0Pޞc]5oVQ"57.!y,ŐOFF-(2͚L`|~Lq/,:z.d4( TNuf ;H0'r۝;R8߮ԏA6}IOIG(p#o!ٲj-2\sO>Mۜ_FT] P< L?V? 9]'G>?7 ZZzu(uGY)fEԒ}P;6nz T2"݂ -A-.~kt›[h$hVmFVNW~qujfWas)FqN?훪5[F IiK m~!q<9  n>WA' RM)_̱)N=pTkQp| Qfz(}$Xn,)}]~6r"7 &C’K֧T@kShJ$# Kt7D(:6zȒ~xgZO?krB3rOc&ې<DK ךj` e,'$}w0Q`bF80/LKl?XaŁn%Plfi,-J)\D] 7WE0!A0`bckQy~<8iZ ~-o_~Vxi%ksqM]lIv _Uʵ~0oa߿_<5>GZ=j;xottz>07'0ڱ͉QR(.g(V0?Zfcs:;,)FB)Ѡj~o.x|4F?cBQ𐟋U2L dMC7Rm 5*v.s.)Xʻ`"zxT5L񾀢**,62l`=x.#%%33xCmJ +c R!huTiR).Lnfj4PaW?6H'I{BeaaX޸=aK{P@9F Ӂf,(Ѳ?)L j6 ݃KʻθoO)4\n Jk oLx78sV?r;K+l\'*]/ )^{ m3M5lɋbAMR*}bwh]Cuݶ=LgyI[gp8]vy+=/'in]`KY5 aj莎,w¾]o4_۟5}h 6/t~Υ:.eFDm\Ʂh"%L -Z3"Jv4Oo>!zX uzS񻠻bzZQsQ) "= w%q) 6겕~#:2mɆDz({gcHVĸgƁ4Ls5 _kuuH"%Nn蓫(/ݲ6s*1%AhtSGPL7V`j IK-c-G>Jg"6b1y.@cv9.bTVl@m(Yά+!3[ǷЭ[ׇlQ+UTdTR'$CKeüʉh"q^D]Apgr̤|Qz|{\sǒ% _.bja}k]9KjpF 4IEIlr@| %fo Y#K%όؒ"eYDzdqH6TXOXn*0= `_wC.+m)N3qчѪ'RؠsCǣ`3wcʅG YlN5T~f ~ڍS~Ե5MV Aѡ:1礎֋o;azRLcߜ߶[Y]RhUrֱQ#CxLAБߒJ6jDݺYZU?1FҢڗ"GΕSɄ RH%G贶ecFK-fm*P{EPM&^>ΛQ/b塯n>y2VTϸ" Jt9ҴxnŤ3}v՝}ˮnHv sf,eׁ "Z) kҖA%)a@rR$<# ^꘽䬪B CSĻe"+Cki61b.;D,*pԳڙE 6*]ޗ8wdM=ƜAE j}v $!\i !^j" C~}98TW*isɠ4!YV9 ]UڋĵL3Ѻ Y DΎ3)Ce " G*at_qE!Ao[awJ?M ǓRs[kmqj^5<ZsA(VUd/MʨH:a$b O39t6ˣ  Ih49< P()zN ]91<(IFEon'&h8|f>jĺ%|!)-TdRi@*~YX!jZŗ6r_[e -:tv[Q80tߠj{vCKY /0E7ϊʽr3d+9 DCYAk!-80h-crkޛ]ƪr7l ONX. 
q=!&m-CZʹp ci̘IE\o8N%?͓> ?z:F]\ϲ J|>{Tz>4[1-Xy#MG!.]yMwyU~]#ܼ\[gς98F+XRD őq=0 4KˎA~e q*mUgu0f,QS*ɀ) #D0ґKFy`!g7zU^QM6v:m*; Vh{]~DTFߍ'锴|Si<=l_HE0* }hB^&s%kE㙉"YFѳzF}q, +§%&0On|@/B38Krp0zu֥dQzR,k 6[&lQȁ{\,zX&smt[m8[- Oq]'pFk/g/y}`ĺEkG1 b:1OC<%fKbw<\} x:7}P2"w&6Fܨ 7F;g?{j%rE r(YD5-)3,¬ SK݈*tvuЭNcjdQc}c+οޞ ֡r%GHFA!MB̭$v@l{2sA=Jz >CIBwQq:Hrd+Vg=>Lfյ_{)Zr_{YJ=Y.R@aׅJlB:2B& )g6'el؆w˛`&d$%!l$]Kk|Ps{̆ ]@ m-!5B$S 9RI(bs=K6Al^Y}Ӓ`6RemcFjPI!?d H#rv;$f o XUաR,1Pr/cƀ˜.%GuSYA Ҿfz9쓲 >P@d`DECZiPtRsZbD R0L9N.[4 ca.6dž$g͗E؜ϬRB72>R}Ǻc 6҃Yx շva7U؅?S3 U!ILAryd<hO'o%*Ew@W=frbҮCW՗ ^gOxA!NOp<}eL+B  MZAq(c_Wf=|D)g{Ow]w7>UrWu[Zγj;i=*1YdH*vx6+6mAkzFJV X $A,  ˡt !c vjwQ~YZKIh s5g*0tV:q ga CCKU-Z}J7t;i/*}I@<PJ[癠'>L'#\-՜.iP*G}s>Nj2*oܨ||GC*tW^|#0S@B};R51dR*x|R IsZe3nYSff+q <۽&} ڒ!0>( ,כ[%Q-V=+O/Ftp i-¨-;i31KD&4jhTh'*戙ڽ͓\qNc$,$$MT`Br`8g쏅=xW8 _f[7+"Dṕ1R +\dR$CeQ;ѳpLhR񎉞f_c9䤐:zb$ #Mr vZah1JsFY/7 i@uv>k^>% " HI4lZbeE6[z`)ևЦc]Kg59ytdz˳ɰ˔,H;n͛$o&i[jNQQ*V]: Q6!GGu`J 0L*eq"-L;LP2GI7yx)K.`ti& Fg82X, ّy~yId^ jE*8oDRP1l"35=]W] |&#նelM-c{X5[x--Buj:;/xE1їQGZ>j/Wnb3钇³6htI>J+pD2J"X24 $ AHQ̦&jPuYCp:G,vI[np(XfCkjjwv݌׆`DCR9 C,˚ Mo**Cj.2*@ E0+m| Vq LȐHDlMB@QA6!eE.֚8ÆO{Aø-@Y:Z,*4x}@8KF'Y)b 8\<!~ ֟p3 7F 0 r3!h5Zhc$&MFzIgCl;5q~Ѵ6:[%e;ŵ3W.sKDP{Ж dQ8KУS]r`] HZJ\Mŭm͎map&vy7ElzMŸAp]3E?Z2Mj3e VRcMLt]Lp͒EԺ+w^  lk?PWQXjXh" 8:o!K++ ,䁳Ȍ4%y"IQҘ8 Zx ̂V"Y1 vĹcκJ Bq)K=Sƒ=>θ7BB H!P΅\m5Yd+y\H- / :~.cGoS>8{?{cC] g?{w0h |6 0ow@(jA¥egyȎ537*.d dc#xvT&C%Ԯ8ΈbՓϦ b.A Go6ǬYl- M^o`HJ?C4^6|lX|}7mr&;iɔ1.~ڍ? ;ƠH+n&85 IIR>~/h˟c5jtx~̚2ݫOܸpRfߔJMts=s^^{[@gz7SUr!]"[ ĵI OSY؊ن_𽙅1[g*)W\%tOGVF~`BTue(˙7ZZ2l2gЄ!X&xkQA 2)\R4kB+o*rlSw5ECfƩ*e0r x40-,i \XX9E4- 7 ]ۚXKDd^ m98fzL k L>čNݪ@%j[vrL Z2$__p =_;L4dkSQWwӿ-.̊eր^A _(^S97?Z$EFe I3$"I`L  [J@:3KH&w))#&p$G2PjMہ~߃dpQO1zx1hVq+8#qz2cvFj2` 3 .͘;$SVw޾č݌K -ܲJ丂bDA-y~889~z`@C܋󤎦V=IKWJIdW2pbR팂:Z+B'w9Kwx. 5]*F o$=&y y˪om+.S.2l,6g#E1T]:xI?aOݭߦAn}sգ? m}iy~QRtr=P J-,WڥT~"B})O~B?fA?L pJX_\l<˵UIIV:pR6λ 8wI H'9PW` k!D׊̇ULxw|%ι^)󽷘ݺvmZ? tu,c[e)']%-Q#D9f1pMfIQg uM cn[+C{c#{~onJ&*gixK]*l{ެ_[J*I Ks)OF0e}]TBJ#\gSz֟ EǫH%BF H՘ c @]z2{H^uK_y7-Qd+L"r]EzHek[v<.{Y zi;[#>+9jxYBN1)o|<*~`Fn7].ó*kY㖉'|ɥ=vg\25ZJ.Ϧ[ͬ]>212uNz*0Mު9)8wFVwmn͕T[;?i8lG/-3̧=MƇ7#ڸ~cwg7/%S s rwp8[c(*M9?klp8xM%WLȕWJǫ.#V_ؕY,H,XVe:У5{ǽu7gʫ ӛ\uUʥڣՁS)V4}1<#,ƋG|j8 aI99zUt>v;Zuvh00ʕ"GЏ/mt饥;\6z2w.y+ڧ^IѸ!ᴤBS6 ?HU~&5?nrS\n>J+u% 0 R88c1[S0 VOsb{S8n-4a7|؛ A^E\gRE7+~tpx7s~ݴα3<B2=Q"C3HB1yeGK%)H&*DhSGnmly)iPgJ`ȑs"%4Nb zV)7l(ɲၙnm*1ڜx%u4UE\=OĕN-D0+ZR)dǑ1^CX;lL@' tU7mv{J>HQr)1dam,+r̈h1,d4&y˙< L0Q[c8F%4mQJt-Iœ(7 xm$}p-qnsņnQl~sVBbzY*W$?wM1bxђ(*^|Z ) bT/]/kE<`UYXA#ƅA:ulaP$VgP˭\I_7~;2(f< [C) !0vZꘌW(:,'òvw٦&HZpTy.XhIcL[ GY*uRE ]u" XQ#ŃZͥbX HQ :q:萕^qlY!?:vd|cOn&+W:XF'H33;en%@"4 } 6XK!?UirٻPF r<s4騴~ehѕ=.i0}Et_vdEF+'bP2J߰"ڼ}BQ ~\71e: ɬi>o?WVOWWk-CE 8)OTRXD,VXtE#$^>0^1|MA"ǜFQNh#S'5<* (*HN!( VwvS{d5z i9)t$v+ ݭbs7ui^JZ=2%sJUf*i9('I@ >8+^txٕUO5ϗy8Y( \O8g Y 2ĥFsPh1\+k3&\zWAgO^'QsCLm$X`kl=9DX U$Y! 
;K}gRŗ9_Ka -ZdvWpH`hQE][v@hDvPIF"{f_#[R"XY \)B`431 BG1281'j¾)T]AH"Ӟmʇ-3*m+x$*>j 5I pȾAŷY|7,Uuô8ӛXfF8U!VVHʌ|]kt #ˆ=> e<,O18uu\Ͱmq*+c]1c`*v%ŏiP3e ^2aRD>_+ݓ'HgH ~KA~ڭM7۩l9&e~9[.*P=O&ecV?nH6ظKCBƻCۍ 5^#Z6(\ӗ|&3)fCVmhR{o#Ptˡ(MPFŖ>y7m`2Em[7|dhB䛠 [mgohfo ݄1E&(H&l1.;ݺ/ DD#Gkz66 ENQ3- )֖XaQ`'K#zF7"q0"q::H+P<] !Cc#3YgU SEhAdbdJ8jbt{%˦UGrX£UGm,oK;dJ)~Fj2)F d?*IytÏ&cc<ׄs3 clPn G }Ww̏.XKi=i>鈨&{Db0`ƛ f!D'a[KTSRӔ19r8KȹQjEw9lVJ'B"F݂hc ;;&sm=ӫc>a l]"5ࣇ b2)b^w[LK_ܜx7z<ݵ`wp`(tR9Wag?{ySr˓QE#rQfXIIᠳFd +}чG)Tacg_(:VQ"sGs`9:rE!-r<ˑ UA+]eV[,©^ A9JDHonޮ;|އf<'pefiYY *-} A@U\MƠ$HEPIgٜ&z Pu{J:8 |vgh/Y]S쾭S΃lG#{Fdh\RGx^&d~/Ҕ٤| ߙͧqYS{e$l۝4{V& =ƴ1O4ZsCr+x$θ1`yQ.Ge/_xy}O 2g0ĦBMayYG"kY"M9ybyaʋ043<|WK+n2oa/ĽK*K>zMAE Q܃ c`bҹVhrb#Vu*K\W\,IQN)93Zt&zG+KvTF`(-D/r[p&L5XRxXNI:ؽz~Kgm/v~ݭ3z T^ە aW4 旟^e1loJL,dKMƨC^3Naހi<,&*Rwȕwgw'L^033CgCSQ*|,ar| W20G;/Q6ͭ7im=ÓBSS\%yqo'g?qC8Ӱڏt~Qm.'e${AJQgX},VgY<}OgY<}Og|䁱,)F Y<[}OgY<}OgY<}OgY<}Og=mGW0=cuw`2$s6g\ lrIWFJyxZ#JI#zxsZz(#j^y]exq46qmc'<GVgz/! \H(>zIWJ9t FC93ȥ0NHXG"\JlB O(A¨U1kQRk ۺR5cagj9G,Ev&J.,[ڎǻ!ӣ]7)-rG3w.!hrj2 M4⨸D /R9A"P *EKn8/rq:R")9IE(fEwZI%HD0 AP$@2Ɛ X20nQmS]?jﵲRgMh01y/3am)KaPh HSBjkEШ`t ,\PVr1Bt4"$X6ޣ^};[;ƣav~~kqI;H V>/`+W JL0L4&ɾ,%*FIqgSdWN\'L/{}<):(bي1_EJ#fX1S`)"tz)SSoN(0y[  ΊN[lB:!zru[o`B./ѬJil|uECކlbJER槺R={v6-c0x(X*~vjC6{5&/^ma-$L/nT)ϳ;TΪ?TO^Ϯ^,5ĀvK̵8[ovQ6yRWfa=gI*myHm>T,3DFRGЁ,<4]܎96ҭy%Fm{WL%QtIs`2r^YS{z,uنN?q+`ǿӛ/~x2z뫋_D]ۋ7]/0`c8m%ٻFr$W8$GpۋyAcEcQִJXo0uؖRZYDW\V*̌ 2E/yO?CcyЀzN2ny=[p bkg@?_ӯL6_kmnn:GMΊtUwLi653|u7"Y?+k}SB1y@QU+9#=Fp$JeJr_cݹoQ8}P4)FLƬ!76y;"ʒ1iIv F"`Ԛ(EC "S0BF[EɳcSZ|}dg65ߡsgnC˶"(Uu!/q[d$RU 1f`ԥd( 3OPԊw)OenY &If-drM=% QAԮX-zh8FYL'Ͳ~ "QVV&^lmL вpHb*`.G݊ϭYJ5Ss߮jyyPS$aC"+D_DA. 堈?'/ds؁s(8Ziȹ֡[~qqâ/uHԲmATlF&ޣmߢiHk ]5jSZm7L=O}E3x+`ER$1tQ2Y'IhooN 6핺45xbN=d V7wޛ_]kMDx{K_h>*"V`c*IQYMxiD; YRyjI:ʄ&>$ aD6a1(` J`0;)MšHDmt훙v$kT^(.8 펤#itJmUcl:ÒM\3uWyH>^ #dqu8M?FM-?󬞾7 }1@.PND -Бa  RX'\]|oh I '<:SmF] mL"'/4iFxpU亭@xJڬ."Pv=ya%DCҭřUp%$aT9Ucqb]N:l 0K*SΞtG1)'p] &޼C.|ExQ+V.z1_/ZBox\?^q4ٲ0!EccHW?o󗎰@~@8`D>X}īA!J`#|UErM\arBl=Hq.չ y?cO5O,˳*b}g{H{D٬ 4^קU=xW< Ӝ+Xy@rNƺӁQG봧6pjvϔnHwJDAH l$;ZHk=^i'֊3,fg#@ \(<ذչRgH笘ôCio|L:u7s%Ippb&ޟjNW^#]6__3ǷS#fř+Oaʼn쏯>-/[j׻ӭ?o@%ͅ&FB̏YHk.ӇypwV P'dUHrNͮ j":'06)xPYsi:a@6 c&12#+DғzR2ZW]NsٚP 9'(@Y"R$ YKcl:i1m3Tml~5D^9`LLd\)s~P+-IP|̦ F֟*t1kb#q xșD t~wE[i _GELOכy+T\=|}NjX'b^:nd W'ۤYmUg !ZS<TT!K59g9*1AP9X0Ia0ܩy:TF8`* 1qxR)䄥rBPIͦs3*|a38bc_/|b]ݧ{vv[9bbqe1_};ь +X@Cm" `@$$\?D g>ض[CMBR6h9%)\N/J-1h5A9L;ھמ CtT)Q9b2va6DUJ8mk-B$4RsPj΢#'kRѪjQ )Q@$.1кW]R-䭎q4&[ǙqSW#[>yMM\%(~}u6 "Th|D)%b#zBX97FV*P:h2ș4NRԍ=b_2jbQZg3-9/~QN~qG563HDA[ ٯɫks@V;_ h3RUbPRTZh2?Vw~~K_]|]]|ZvQf߃y+owg8l4;kt^,d,ӊ=gk E]&o3ԋ.qwpPn%uU=)"u>ƀt*//*pDH?\\<6? wC~߫_YAu:WWSQll)l[u Ю_z턴i94[7r~sW5l5 eX+9td`!R3QB K փ(gԞ3Ћ6?_|g)Ns*eL!P^y频S)$)DV$̺-W%Bc;V- JB̀I 9'FkͫAE.0mNj_mfhH2%?{GأDt_l.\`A,ʲF3Z+=,H#]ifWUYL&(bU^Sгr8]'k斞)>eV.HW"XL!Z5R9 1naLW%x25w6!sꗬf/.r{Pq1LYXpwc-Ij}aWhVBH8΍o0߽[ϱ|RdUB*&%D bkIb X@.ZX3 e%%;⒏#rQ2f AZy:Q. 
h5nCXi/TRPSüyf|>Rp't_%twqa:N%}tu>;GJC_~pH ݑ ,nҊ5%6zŮncp;LwL')MKח[nVA7?O;YV(G=:a7S\>w3DΦb~tw)aa7s480tf[~ a㛮 0]wF aRNIx͎ lJRɝ=[Pis:ɗ SDܷm1VG8 5W.1/_X޳={FʈޜXb9mz9ثLpVwbX>?7F;<=m8aNVgv\ִ 2>S 3߽c*wl`5 t<b/*/+6]{)__nhN|P(!'82&9ʑ ՊcmPpN1bګgw)N׊UhR Pe#8# 8Y\0JENm‹_2|[+ _8Ay.}4|:F8GBG7 &z6+/gSCUGD"A€"l^WHo,~|>tJpcLgl6~C=<^268g d&VCTBBvUt2RhtY_΃w6_/- Ḁ,݌.zӞJ.~xq#CqrImJAQYiM襁T5]S(C䉋Kv Ў( qp [6^{/I؏7w 21'HD2s&#wwۜoK=2BPQJWX+[ O$1$g[O'0!J6[[ŕa{դcQ!`SA),U5ؒ", LrjIAPùj[g>9EB|]U$ 9O&O>>f1#^}$!.GueY\%k6PZs2Aѡ} ZZ ݮӚH3 ^c.Қ%ƘJtJZ!Qpc*c P%XC%Ve&tnjwoS]ȊqdMh 4$dT|HR&91kjZ%m X4dFH^+YGDth[ &<5|?@5,=G"]Q6~^U|avn5eFYiJJcoejKM&P15EY"Ѓ>~wޞIn%k4q^ߎqU}˴tkqƘlddW6q"ew"FtĄ.Ϥ1BhZfgu- cn[+y'͎3gx۷TuT".rލuH w䮰YɬТ{t1dy}T{ ޝi:] @ߺ(QrBUXkZzbLx'%a/-7>c^_|![S8;<~<ȨVZSS&dhR%3Άm t5iuQ|eօQdmτa9E*|Qkۓc+.^\K]~W4_R>Z_ьlĬy\>Km]kb pw]iԦTΗNu;JiٶQ4DHbF s1iqY=i\vNRTk6'^L\:HGk*)+PE皥FK%FNi;oΩ5}v \O`9WMq{ X=͍ %HH uQld !jlQR9P/* `Sj-f~ l2+T1FKC6}F6Etҍ7.,GTiX_ӥsE?*>R<Q1|6o,$&"X4fc+no4d+`L&(bAfSгr8]'k斞)>e$kԡ"XL!Z5R9 1naLW%xg_ ELZ?wZ Ӛaml-W+AKÌ꓾%RY@4y1.lL [ /&6 >F̍`*@(= 5}LR ]e.%kBM)%m:, _W?lT'JDVd"QAKNQCJWY C1TbEQ9py"ےtҵ%SV-s@(K̺(ht;Y@aQ9Fg̠JFel5XZ [gڧBIg +F9! vEa˰2cñlK`rr8"y~Ki/}%5lZ/=_:Lv|[l54*o3ʷZlȼV'R>kgRDYMjLNf-^})Q6 TPM֪ɗQD%j┃526-pXe0[8dmamFmڽ|hxM x{8C&dȁXt$H&!ZQ7|HY%Frix߲>lm87ÆORŸ)qOYq^;,1uiL^6:i"-SY Dk71m1*e] )D ЊD#1i0N2b H:^c$[gkzQy;:a? L-AA[.#|MF,1BN!XtɁu1 I=q'oEס7F:6E5))5{gq 3('[8uޏOHzңM0,yt.י i 1麨% ruԹktoƤ;X^Dccf'> "xVl,)cdL` ֖خ "3Ҕ牀' DIc,'Dk⑦30 Zd$x@2Lgs NKŔ%ʞ[Nƒ=>θ7B?CRԭI&U{T'VT5'#t"ٰdz"xP bie Χ`#ep}pB?{'tjdϳHQ/?^< ZN ƷKϻ@f"\}(0lQX3m1bR{2^斔[x,co=IQ C< ߶Dy&?1b#7cV,xL2݆vH 1Suˀ 9o%:8^N~ys+)0{M`'#6)Rn kIIJd; Z&e}V&e9h(8F 0dWߨ Pr X+wHbB\ʦ:ut8KLyl-o0I\'H JV:ma3A.y3&tIƗrJhڭ5O<յl7XV }EGwT{>jtq^N%qs#RzmI&fǣArKnU҄bmuT5gFd˭-DZhuYF'?|ޔkJuR1 A&A€ʀYr,q\8-<_NLGf92 lmZ1ki5lM{.HyHm,m=7=2 eN"I"@0)QࢷuX<+hvFVڎ<1j:i4 )NlnX1 @m5kGׁ ُ|;2co u1^ciʭdB^fdr& dG}&#`#%] =N!nu%nKFoiP<$ܢ#TɻHk^;%&-sۡm m0WA^Z վ}ڪV3+晚AKaVDEݒlPYk[vEֈJ`Ӧ {E5ONszKǐmDhU2K$( E/2ltùɻѩ*^1Y7B!.tQMsm|[c=HNR(@T*Te1Ho.h+ -N[!oJZĽj]Ouv 3v_g PwGtA)~,|>?¬P[m e5Ba: iiMNzPJDhTfJɐ4C", AɤB1˰U dcJ`0D4+alqBXB+/J:"nW)JL}-ֆs3P"x28IZ[zU=UΣ=|Ik=|(RFXpHC,a\jE3Qr炂wۊw1QJ~:eXP%/7Oon16~z~OS܋&V=KWJaNPg%uVxN;sqg/Z~jT IzH8B 7U{ܲL\EFiէYmNg^FEb)]Wv R {vU]6MrW\hǻ./H-l5gy=6z HJ;HD UuHE31cmЏU/=[=rmURu\ԙ;|c]>&i"I.`:F w@9ŵ"ahx>ޝA;?߉qW|G.5]٬~/>e;˘sxf1pB)jP0Ǭ4irp]Y[OQ81eL051VgDZov$+<.sގtz 1ї\ębT;}mS֗uA%$?~uu=eBH|]pY2,d)ԐZ 2ѥGh/ -Qd+L"r]AOek𞵠rO-.{ۊY za9[#>+bjpQ\N0 |<(M늾~Sl0i&YiIҕ O2\]! k θdk\ƅ5vǟM~Z]>212`#؝f`Us"-ǭO|?Xn?}}].:[4! @?du_еmkoҵp o~~[^+rCh{q6$׳ $BsSf( OU|Mb~԰rM\n>Vz!uK~@bSp[p8'bA?<٢'g8v-4aק|wث A^EXgRE?e08s~ٴα3<B2=Q"C3HB1yeGK%)H&*DhcG&nMlx)iPgcȑs"%4Nb zV)7(ɲၙcJ1ښx)}}4ey\=Ǖ՜=fRKU*8Ҵ8kkR 蛣@λ`ӼݜHk9 6\y9kfD4RTB IyLiۅs&1#Ak(%Fœ(7 xm$}p $2Z9E"fx)BU. x(❌GN*.=FdȒ%CF9J, Z$יȳt"LwۚWK2~7َS$-H,V@h7BH61 hE'UNsEU9,{i2-dL0A8aڭu"+ fh%Wrpn)ggXعs&m0 *dL;0tٖ܋: aMKf4Dv"Ц/ٻ6rtKf2UṠuvIUf)mIJwk]_A )u4JL|सBj`SI54 oD,VXt/P% vcN#(p'U)CGr$GZ$Yo{ Y3 ׾m #ɻvIͽZlJV0"6^pvɟVk`ϵ;jGɜRmY.JZIRǭBW<(t~{ӨzziΨc!J @V q#@7j0z&Wk'{@q\>8{wD 2b0u`݃e\Fb-pVHf 2T/I &_^\<zZ4l[pH`hנE][v{0/uēW6{ECْ""0΢Ot"/N3u(yT*#ӫA]~^cC ײ+.BI%$6#_e~ #_2YK4UVXoԁKB6Nz%=|SM=RѪ7u~qS=_/o\~e[ԭ@.,ܣD`>i;uwDzћҷpྫྷyjI59{٢{|Dɒ{qK! 
|<SZuSGwoZ.rN-rj]-7 顃"(Zzn!llJpЅCoߛ<#kx 9ܟǹ N&ln|oCf& 2 w?4- 3w}!T窛Uj֝ClGo5LJlG ; а@U j_ANxlfEɼ/[^T .nZV4.N7ӛCOoO{ C)ppXBiO(G)֖XaQ`'!őw)L7ꬃo@vEPjyjPgTSNE TE+N^@o$-[I˃Sxu O^Wld\j^3R>Z g\v-G˂ )'0_x?qA46sMh19 ʭrB#ሗFcHS)CpXf:" z4&hYT {+r)k t2Ftq:G5`)92J#"( -J)|$%RHDc足F{\ =O gϓ1^l[s!3ALBCaU/>&܌GQl5 X bΌO5ι3,_?{[NU4(eV1"HJ" 'FeS +}$JggܴBq>7*G{߯$>ݝ03RnR !P0H'~XRaD!wֽOd KDt;;NΠUaAU((>QG.hb T"D QDrV66q9"tfeF ûE؞Ǔl:b FYb6uۧaS˴9 @y>\MjFsB$`CZ/9ŠhCZa-QlGfYxURli^|0) OAM v[P ٗm>k3OLUx/B#o ]UVԣ(or$ۢB( Kl.Ld&A=8_8 k![d;[7:tQmGN<ͧ_&AeWEU۬ ^o2`vY%Ϋ*e+߸;uher4|bYmq ோbVyٺàvqP iڮvmylEeJrV} ܮ2R>X9te[-X~ݴmUf޶QwGpM'qv(Va9iӈktК(zx3ffΌ2࿷# "l(4ՔW?nZyn_e`FA\5q@?drs<+d9ϧa|~y9C='PGdx{5[Ƣc\Wcb%`DwH/CvU62rI}A?d"oKub>&PJ/;2 P[iT%)~,‘Ҭ{Y)\ʪ۬\xf~;jSr4^9+$۱Z%T]aCLwǃCz?BSi<74Sr%1$RÁQ䤔| 9ԕ R+R3EOe6;˜|ei=eG٦c)^r# I|ϵ,wJÃ&`]P78Ŝ9s˭sb#VNeLʘ Ho^oT4k2\,&乀(zz^䔤3Y"ֽ6>K*ZRY\`N^VYE[PnJ?*)趶|%c~3l䞤9m̿dbZʖ꧘dNCNya/*ÞS\(9){n%tsbgNtb!>RhogŲ>|* :misDZr6^$1z`Yw7+H5PLkspp"^BmqV_Kʔ%0Bv:e6=hO6Ls=ombowyVZt.˳ d*ͻ3Mզ :lvvV>wE/uL8a?cHlGyKE٢%cIQQKEv`[LዺiVm|v )F25}Ұs\MyrY^K{m1U8+dn104r()99׊s.oN TxiQ$:++y yB"nu*mzvm[f7 Y.Uyfd&.Ц$gr> V$,~WKyuMOTQ4x8Ԋ>> olHyh Bc%W=hN.][sEl 3h13rt9r}kvoZ87qv8d4mn׼a_AQG1p>-WRR|~Ԝ-BWJӍ6C4N+)7#Cnv`e8UMR#d&m{%f#ahICm& H@k鮟O/)}f%h(ц8*IT> q.[A{KY#K ]1H>AL5zx uTI9cS*9k#y HN@t UjWg.Ν멮bkkyZ|3aw,qKsxgەlנmwS[J)WJG\! kBU2@4X XYeS]?qlUсw嫈*"Ff)BbKGEJh5gp$@A3HE}Y8A{eF"&P&r-D Yh Ҝ 4+v8u1#ȁ-/z@{V3 /&JiVyP|m (w!qiMR (S)Ȱ,ل{-&˄'ZҒ, UN 2}` KYQ9s$9g 2.H&, fZngIAL21jDNܪ?[ΊN=Sacs4(>%Y!BƠ2&m2 5 bhhz()/ez=y"NK5o)1yTD]ejZ~qI6?2% @ n AFCqkeN"7 n=^NE{QDqaY5B˺!}6MLĴ֊ZhhLR|HN!N街r-Yu/z~,Wl+|FlVj#A '$2hxxt0H!:hiRؑ-dzs4qVȾ! KWzwy zB8T퐗Rv HM>S&quOdu|QŇ( !&n*PZ;.l2w+]/gk#R"C v 38P=Cl!6z~گ`R~E_N\:e$H5BWJR jW#YM9 -k27^0-d1qjAbA[T*HM l[cg48}7SR>.H.q=Nqa0g"/AIK'`ḟ9=lwAt秫%v늈N9YL`tw) 1>(hB>",.S-U<jUKP5A4B^&+I1hbTD$k 󚂓P¹4QK̠H(Ƣj! 4e0KmК3.P TyVX1cgbC^c_HsDbHn=ZwdwsWj|<*gφ!_ BzA I o#y[ Uh4,.CW)x&EUO5Kf$gl_z'Eݭb2+J/&iiXVI"ũu Yk[raye="Y  ^!j-^3]A>v<늌.Az؁Ws~_:0٤q͘j<\4\&fP,0 ۩| ͇c0 [mn%?pe@/og$l!cnRc,3IdlҜooXx)jgg&Jx7z;ޛv;gnxA0 ^3 U^Kx4S 3"$E/I0eE ::+p(A2ԓJM] I8Ŝu$$ODnaĥ$^WOQ(6J˒[% £UGPudFJ9G?ϖ.-Mw̖'*ll0/$$ii\*QI+:RYбϑJsHciC'/sM[fCQe!F^fQc:3!>W~ rl|2NsX$)A'mmT%3)Pk9fOJJblGB E O 糋/x&.a=1 &ܖ4r֗~th2J>R|Ip!F{*X~=g?~j 2{Mr/=`&њ)')$h w*R⅀\?bcݙpD FD$:h %70MTsa# Ձ%~xKeΪh)U:"D+3&@ݒg&EGQ0w(Tɓҕd] dzd*^b嫷7X6EmQn+z3Y [/6S XĚgUBx=* co9(X{rb!|0&М`UPY&%uPS fYcĨdb0@P;K9! @K;+ь b *h 'BC5rM,ጨ$A#6vy\ywF"da8  ^/<qp_8 mqp[(Kn{l4$ۑv|Ox1N]]f7W6gT5qnZ~p}[k:ek%᯺UY;'S5hɍ5-2wef{{6ObۜL;ʣft:gQֹOv>|7D򽑛Cvt[^ݞ֮<*:ܩ[/=g{ ݂nCwJٻk{e0=qhv{>P%ʇ V3hvu!*cB(3"1n+ x` ?+)JcOK;^.h)Q3C9,c&**@ m1ژelOKäH# '88 p. Iqc-M J7,Ύ:"sXHR{ * "gJ3HJF%64hf-2H0zj+cHA,7(&Q~ J^HO;O W嘄%9P</?9Œ#+pQDUdQC4\RbXpF‚n. psՖ_w!&4ts@s&SR"Z#e4g(EU"@4:PɩNE_>jtҬ3V3(@m |g uL<9JcnRٻ8c Ȇ=d߇!?8~D0Hb}TS+.wzfgjg/==WS;`BӋGe%|s+m"5YU[j.6y ]S26t(?׮k{U@s Dh,@CA 6GDe1aZ\oB O՗s:{ӼL(T^U~6vrkܫKW߽L'đ۾)ax۞;ygVnm<.MrۺfmXh퀴iQC"}tzGXS;O+jD@`Ō&e@?>7z?Y({5 ٍikeYf LBF$@9(y4 *~@W>%vt>ҵ>,|1t#ZKܼ牠WНA\z~S}:>?Mj#)ixJ$F:ZKo$A0 hr8}:#=7D}\V XQg)pos:%*EB0(,Jrk)n2nOGga [פsvm1{}ms43gkm^[{$'|TmEKŤmsZ#&E]#^xW;۫n&꫁c_VsgbL޾sv9Jp2;M~:^@|RVDtnI\쐙t]4))dQs-}-X^gZ6N *|߄J *Aq^E"U^9'^QJ@\[CNnj񯠓.T\/mdSI{ɖw;~г )- E5@qB`I I+8SI%bAAC:dE'*-#Vy [+c'f5v^kbBZ97OTh3D0ϱEJ^J9y :EiLI(@`4()V$Ĭi QlNLޓ@ M1YϢhRNCTTqG.*m5fXF}m)l ` wj T32R5yKc:;;vrs YD47^ 78C4E5 gTٸ!=3βJXr;blb‚*H$hS:nTsn~:+s_vq_m [m;X^7F"K>FI o)d9K Q5HڀI*7 dȈ @(@brUFc4]*Śs=lI}_>}=-"%e-zE,/3MӎzAWBPZEH!CL[#m0fa)~-+a %Di&,e;FD+LȤp)YJk-jv^Yl.vvq׎xK<6QDP9 e">H:+A`8I. 
var/home/core/zuul-output/logs/kubelet.log0000644000000000000000003251442415150357064017700 0ustar rootroot
Feb 27 17:34:58 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 27 17:34:59 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 27 17:34:59 crc restorecon[4681]:
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 27 
17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 
17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 
17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 27 17:34:59 crc restorecon[4681]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 
17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 27 17:34:59 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 27 17:35:00 crc kubenswrapper[4752]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.612402 4752 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619650 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619690 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619703 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619715 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619725 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619734 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619741 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619750 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619757 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619765 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619773 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619780 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619788 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619795 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619803 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619810 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619819 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619829 4752 feature_gate.go:330] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619854 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619865 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619874 4752 feature_gate.go:330] unrecognized feature gate: Example Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619925 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619934 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619943 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619952 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619959 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619967 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619975 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619983 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619991 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.619998 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620006 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620014 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620021 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620029 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620037 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620044 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620052 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620060 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620067 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620075 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620082 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620090 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620097 4752 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620107 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620120 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620130 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620138 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620173 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620183 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620191 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620199 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620208 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620216 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620226 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620234 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620243 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620251 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620259 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620267 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620275 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620282 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620290 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620299 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620307 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620315 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620322 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620346 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
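The long run of feature_gate.go:330 "unrecognized feature gate" warnings above is expected on this kind of node: OpenShift-specific gates (ManagedBootImages, GatewayAPI, NewOLM, and the rest) are passed through to an upstream kubelet that only knows Kubernetes gates, while gates it does recognize but that are GA or deprecated (DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy, KMSv1, CloudDualStackNodeIPs) draw the feature_gate.go:351/353 warnings instead. Since the FLAG dump below shows --feature-gates="" and --config="/etc/kubernetes/kubelet.conf", these gates evidently arrive via the featureGates map in the kubelet config file. A minimal sketch of what such a stanza could look like; the two gate entries are illustrative, not read from this node's actual config:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      # Recognized upstream gate that is already GA: kubelet accepts it but logs
      # feature_gate.go:353 and will remove the gate in a future release.
      CloudDualStackNodeIPs: true
      # OpenShift-only gate: the upstream kubelet has no registration for it and
      # logs feature_gate.go:330 "unrecognized feature gate".
      ManagedBootImages: true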
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620356 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620365 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.620374 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620579 4752 flags.go:64] FLAG: --address="0.0.0.0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620596 4752 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620642 4752 flags.go:64] FLAG: --anonymous-auth="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620654 4752 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620666 4752 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620674 4752 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620686 4752 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620700 4752 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620709 4752 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620718 4752 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620728 4752 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620738 4752 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620747 4752 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620755 4752 flags.go:64] FLAG: --cgroup-root="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620765 4752 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620776 4752 flags.go:64] FLAG: --client-ca-file="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620785 4752 flags.go:64] FLAG: --cloud-config="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620794 4752 flags.go:64] FLAG: --cloud-provider="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620803 4752 flags.go:64] FLAG: --cluster-dns="[]" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620825 4752 flags.go:64] FLAG: --cluster-domain="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620837 4752 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620849 4752 flags.go:64] FLAG: --config-dir="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620860 4752 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620872 4752 flags.go:64] FLAG: --container-log-max-files="5" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620885 4752 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620895 4752 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 27 
17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620904 4752 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620914 4752 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620924 4752 flags.go:64] FLAG: --contention-profiling="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620934 4752 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620944 4752 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620954 4752 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620977 4752 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620988 4752 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.620998 4752 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621007 4752 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621016 4752 flags.go:64] FLAG: --enable-load-reader="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621024 4752 flags.go:64] FLAG: --enable-server="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621033 4752 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621050 4752 flags.go:64] FLAG: --event-burst="100" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621060 4752 flags.go:64] FLAG: --event-qps="50" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621069 4752 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621078 4752 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621086 4752 flags.go:64] FLAG: --eviction-hard="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621097 4752 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621106 4752 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621115 4752 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621125 4752 flags.go:64] FLAG: --eviction-soft="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621134 4752 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621172 4752 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621183 4752 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621194 4752 flags.go:64] FLAG: --experimental-mounter-path="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621205 4752 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621216 4752 flags.go:64] FLAG: --fail-swap-on="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621227 4752 flags.go:64] FLAG: --feature-gates="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621241 4752 flags.go:64] FLAG: --file-check-frequency="20s" Feb 27 17:35:00 crc 
kubenswrapper[4752]: I0227 17:35:00.621254 4752 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621263 4752 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621273 4752 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621282 4752 flags.go:64] FLAG: --healthz-port="10248" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621291 4752 flags.go:64] FLAG: --help="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621301 4752 flags.go:64] FLAG: --hostname-override="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621309 4752 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621319 4752 flags.go:64] FLAG: --http-check-frequency="20s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621328 4752 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621337 4752 flags.go:64] FLAG: --image-credential-provider-config="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621346 4752 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621355 4752 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621379 4752 flags.go:64] FLAG: --image-service-endpoint="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621389 4752 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621398 4752 flags.go:64] FLAG: --kube-api-burst="100" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621407 4752 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621418 4752 flags.go:64] FLAG: --kube-api-qps="50" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621426 4752 flags.go:64] FLAG: --kube-reserved="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621435 4752 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621444 4752 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621471 4752 flags.go:64] FLAG: --kubelet-cgroups="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621482 4752 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621491 4752 flags.go:64] FLAG: --lock-file="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621499 4752 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621508 4752 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621518 4752 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621531 4752 flags.go:64] FLAG: --log-json-split-stream="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621540 4752 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621549 4752 flags.go:64] FLAG: --log-text-split-stream="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621557 4752 flags.go:64] FLAG: --logging-format="text" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 
17:35:00.621566 4752 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621576 4752 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621585 4752 flags.go:64] FLAG: --manifest-url="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621594 4752 flags.go:64] FLAG: --manifest-url-header="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621605 4752 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621614 4752 flags.go:64] FLAG: --max-open-files="1000000" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621625 4752 flags.go:64] FLAG: --max-pods="110" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621634 4752 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621643 4752 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621652 4752 flags.go:64] FLAG: --memory-manager-policy="None" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621661 4752 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621670 4752 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621679 4752 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621688 4752 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621708 4752 flags.go:64] FLAG: --node-status-max-images="50" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621717 4752 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621727 4752 flags.go:64] FLAG: --oom-score-adj="-999" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621736 4752 flags.go:64] FLAG: --pod-cidr="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621757 4752 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621776 4752 flags.go:64] FLAG: --pod-manifest-path="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621785 4752 flags.go:64] FLAG: --pod-max-pids="-1" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621794 4752 flags.go:64] FLAG: --pods-per-core="0" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621804 4752 flags.go:64] FLAG: --port="10250" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621814 4752 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621825 4752 flags.go:64] FLAG: --provider-id="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621836 4752 flags.go:64] FLAG: --qos-reserved="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621847 4752 flags.go:64] FLAG: --read-only-port="10255" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621859 4752 flags.go:64] FLAG: --register-node="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621870 4752 flags.go:64] FLAG: --register-schedulable="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 
17:35:00.621880 4752 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621895 4752 flags.go:64] FLAG: --registry-burst="10" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621904 4752 flags.go:64] FLAG: --registry-qps="5" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621914 4752 flags.go:64] FLAG: --reserved-cpus="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621923 4752 flags.go:64] FLAG: --reserved-memory="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621934 4752 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621943 4752 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621952 4752 flags.go:64] FLAG: --rotate-certificates="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621962 4752 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621970 4752 flags.go:64] FLAG: --runonce="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621979 4752 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621988 4752 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.621998 4752 flags.go:64] FLAG: --seccomp-default="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622006 4752 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622015 4752 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622038 4752 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622048 4752 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622057 4752 flags.go:64] FLAG: --storage-driver-password="root" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622065 4752 flags.go:64] FLAG: --storage-driver-secure="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622075 4752 flags.go:64] FLAG: --storage-driver-table="stats" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622084 4752 flags.go:64] FLAG: --storage-driver-user="root" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622093 4752 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622102 4752 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622111 4752 flags.go:64] FLAG: --system-cgroups="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622181 4752 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622211 4752 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622221 4752 flags.go:64] FLAG: --tls-cert-file="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622230 4752 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622247 4752 flags.go:64] FLAG: --tls-min-version="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622256 4752 flags.go:64] FLAG: --tls-private-key-file="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 
17:35:00.622265 4752 flags.go:64] FLAG: --topology-manager-policy="none" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622274 4752 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622283 4752 flags.go:64] FLAG: --topology-manager-scope="container" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622292 4752 flags.go:64] FLAG: --v="2" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622304 4752 flags.go:64] FLAG: --version="false" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622315 4752 flags.go:64] FLAG: --vmodule="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622325 4752 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.622334 4752 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622576 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622587 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622596 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622605 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622612 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622620 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622627 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622635 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622644 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622652 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622660 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622667 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622674 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622682 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622690 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622697 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622705 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622715 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
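[editor's note] The I-level flags.go:64 block above is the kubelet echoing every command-line flag with its parsed value at startup. These are not necessarily the effective runtime settings: --config points at /etc/kubernetes/kubelet.conf, and for example --cgroup-driver is shown as "cgroupfs" here while a later line in this log reports the driver actually used is "systemd", received from the CRI runtime. A rough sketch, assuming the same saved log file as above (the FLAG: --name="value" shape is exactly what these lines show), that turns the dump into a lookup table:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet.log") // hypothetical saved copy of this log
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	// Matches entries such as: FLAG: --cgroup-driver="cgroupfs"
	re := regexp.MustCompile(`FLAG: --([A-Za-z0-9-]+)="([^"]*)"`)
	flags := map[string]string{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			flags[m[1]] = m[2]
		}
	}
	// Per the dump above this should print the node IP 192.168.126.11.
	fmt.Printf("%d flags parsed, e.g. --node-ip=%q\n", len(flags), flags["node-ip"])
}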
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622729 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622737 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622746 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622755 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622764 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622776 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622787 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622799 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622811 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622822 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622833 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622844 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622854 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622863 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622873 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622882 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622893 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622903 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622913 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622922 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622932 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622941 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622950 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622960 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622969 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622976 4752 feature_gate.go:330] unrecognized 
feature gate: IngressControllerLBSubnetsAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622984 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622992 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.622999 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623007 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623014 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623048 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623093 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623104 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623114 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623126 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623135 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623174 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623182 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623190 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623201 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623213 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623222 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623229 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623237 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623244 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623252 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623260 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623268 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623275 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623285 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623295 4752 feature_gate.go:330] unrecognized feature gate: Example Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.623304 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.623329 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.637874 4752 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.637982 4752 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638389 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638405 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638415 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638424 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638432 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638443 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
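[editor's note] After each warning pass, feature_gate.go:386 prints the gates that actually took effect, rendered as Go's %v formatting of a map[string]bool: only upstream Kubernetes gates survive, with the GA/deprecated overrides (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1, ValidatingAdmissionPolicy) set to true. A small sketch that parses one such line back into a map; the {map[...]} shape is copied from the entry above, truncated here to keep the example short:

package main

import (
	"fmt"
	"strconv"
	"strings"
)

func main() {
	// Verbatim shape from the log's feature_gate.go:386 line (subset of the real map).
	line := `feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}`
	start := strings.Index(line, "{map[")
	end := strings.LastIndex(line, "]}")
	if start < 0 || end < 0 {
		return
	}
	gates := map[string]bool{}
	for _, field := range strings.Fields(line[start+len("{map[") : end]) {
		name, val, ok := strings.Cut(field, ":")
		if !ok {
			continue
		}
		b, err := strconv.ParseBool(val)
		if err != nil {
			continue
		}
		gates[name] = b
	}
	fmt.Println(gates)
}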
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638453 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638465 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638476 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638499 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638512 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638526 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638536 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638545 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638553 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638563 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638572 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638581 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638590 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638599 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638609 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638617 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638634 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638642 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638650 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638659 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638668 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638675 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638684 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638692 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638700 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 
27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638709 4752 feature_gate.go:330] unrecognized feature gate: Example Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638718 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638726 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638737 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638753 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638761 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638771 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638779 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638787 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638795 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638803 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638813 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638823 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638830 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638838 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638847 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638863 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638872 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638880 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638888 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638898 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638906 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638915 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638923 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.638998 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639049 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 17:35:00 crc 
kubenswrapper[4752]: W0227 17:35:00.639061 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639072 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639462 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639477 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639485 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639494 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639502 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639511 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639519 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639526 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639534 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639542 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639550 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639559 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.639574 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639835 4752 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639851 4752 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639860 4752 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639869 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639878 4752 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639887 4752 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639895 4752 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 
17:35:00.639906 4752 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639917 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639926 4752 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639936 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639944 4752 feature_gate.go:330] unrecognized feature gate: Example Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639953 4752 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639961 4752 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639969 4752 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639977 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639988 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.639995 4752 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640004 4752 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640011 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640022 4752 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640031 4752 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640039 4752 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640047 4752 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640057 4752 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640065 4752 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640073 4752 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640082 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640090 4752 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640098 4752 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640105 4752 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640113 4752 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640121 4752 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640128 4752 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640138 4752 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640175 4752 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640185 4752 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640194 4752 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640202 4752 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640210 4752 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640218 4752 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640226 4752 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640233 4752 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640241 4752 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640249 4752 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640256 4752 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640264 4752 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640272 4752 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640279 4752 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640287 4752 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640295 4752 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640303 4752 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640311 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640319 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640326 4752 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640334 4752 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640341 4752 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640352 4752 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640360 4752 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640368 4752 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640377 4752 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640385 4752 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640393 4752 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640400 4752 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640408 4752 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640416 4752 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640424 4752 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640434 4752 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640442 4752 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640449 4752 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.640457 4752 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.640470 4752 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.640771 4752 server.go:940] "Client rotation is on, will bootstrap in background" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.645786 4752 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.651520 4752 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.651698 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
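[editor's note] The E-level bootstrap.go:266 entry above is the first real fault in this boot: the bootstrap client certificate embedded in /var/lib/kubelet/kubeconfig expired on 2026-02-24, so the kubelet falls back to the bootstrap credential flow and reloads /var/lib/kubelet/pki/kubelet-client-current.pem. A minimal sketch for inspecting that PEM's validity window with the Go standard library (the path is the one the log itself prints):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for {
		var block *pem.Block
		block, data = pem.Decode(data)
		if block == nil {
			break
		}
		if block.Type != "CERTIFICATE" {
			continue // the "current" pair file also carries the private key block
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			continue
		}
		fmt.Printf("%s: NotAfter=%s expired=%v\n",
			cert.Subject.CommonName,
			cert.NotAfter.Format(time.RFC3339),
			time.Now().After(cert.NotAfter))
	}
}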
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.653563 4752 server.go:997] "Starting client certificate rotation"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.653614 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.653903 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.686207 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.689354 4752 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.690104 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.712201 4752 log.go:25] "Validated CRI v1 runtime API"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.746295 4752 log.go:25] "Validated CRI v1 image API"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.748638 4752 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.756443 4752 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-27-17-30-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.756498 4752 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.787394 4752 manager.go:217] Machine: {Timestamp:2026-02-27 17:35:00.783702023 +0000 UTC m=+0.690518964 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3997dbc0-568e-470a-afbe-a819259fb419 BootID:78085164-654a-4899-838b-cadb0192fc93 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e3:75:1b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e3:75:1b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5c:73:a9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d1:24:f7 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:18:65:d8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:47:f7:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:2d:05:f6:f9:12 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:85:47:29:65:bb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.787843 4752 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.788081 4752 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.788797 4752 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.789118 4752 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.789216 4752 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.789607 4752 topology_manager.go:138] "Creating topology manager with none policy"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.789629 4752 container_manager_linux.go:303] "Creating device plugin manager"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.790184 4752 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.790569 4752 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.791303 4752 state_mem.go:36] "Initialized new in-memory state store"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.791455 4752 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.796243 4752 kubelet.go:418] "Attempting to sync node with API server"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.796283 4752 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.796419 4752 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.796451 4752 kubelet.go:324] "Adding apiserver pod source"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.796478 4752 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.804197 4752 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.804987 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused
Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.805048 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused
Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.805201 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError"
Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.805214 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.805376 4752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.808984 4752 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810695 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810741 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810759 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810775 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810799 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810814 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810829 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810854 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810870 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810886 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810905 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.810921 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.811834 4752 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.812633 4752 server.go:1280] "Started kubelet"
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.812849 4752 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.812990 4752 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.814329 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.814712 4752 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 27 17:35:00 crc systemd[1]: Started Kubernetes Kubelet.
Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.825705 4752 server.go:460] "Adding debug handlers to kubelet server" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.827522 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.827743 4752 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.828565 4752 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.828608 4752 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.828683 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.828807 4752 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.829035 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="200ms" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832237 4752 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832273 4752 factory.go:55] Registering systemd factory Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832290 4752 factory.go:221] Registration of the systemd container factory successfully Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.832298 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.832424 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832649 4752 factory.go:153] Registering CRI-O factory Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832675 4752 factory.go:221] Registration of the crio container factory successfully Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832708 4752 factory.go:103] Registering Raw factory Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.832731 4752 manager.go:1196] Started watching for new ooms in manager Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.832080 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.102:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.833957 4752 manager.go:319] Starting recovery of all containers Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.845987 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846077 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846104 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846125 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846179 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846202 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846225 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846247 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846273 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 
27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846293 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846314 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846338 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846360 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846390 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846415 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846445 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846502 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846525 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846544 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846564 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: 
I0227 17:35:00.846588 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846611 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846633 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846658 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846734 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846765 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846800 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846834 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846866 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846902 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846931 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846963 4752 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.846991 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847022 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847097 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847121 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847192 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847220 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847244 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847279 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847303 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847325 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847346 4752 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847368 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847390 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847472 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847497 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847519 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847542 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847563 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847586 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847610 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847638 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847701 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847727 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847749 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847771 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847793 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847817 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847838 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847863 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847898 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847929 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847958 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.847985 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848008 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848029 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848051 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848074 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848106 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848127 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848176 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848198 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848219 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848239 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848260 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848282 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848308 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848329 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848349 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848369 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848391 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848411 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848433 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848454 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848475 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848496 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848519 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848539 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848562 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848584 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848608 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848629 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848651 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848673 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848695 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848718 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848741 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848764 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848787 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848808 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848830 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848852 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848873 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848905 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848930 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848954 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.848976 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849091 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849119 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849172 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849197 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849219 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849244 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849269 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849295 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849316 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849340 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849363 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849385 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849406 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849430 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849451 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849471 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849493 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849523 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849545 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849615 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849641 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849661 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849683 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849705 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849725 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849747 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849768 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849791 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849815 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849836 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849857 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849881 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849901 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849926 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849948 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849970 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.849990 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850011 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850032 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850053 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850076 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850099 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850135 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850199 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850229 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850252 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850272 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850294 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850315 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850339 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850359 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.850382 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852305 4752 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852352 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852387 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852417 4752 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852449 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852475 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852495 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852516 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852538 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852560 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852581 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852602 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852623 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852648 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852671 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852713 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852735 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852755 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852777 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852797 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852818 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852842 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852863 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852882 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852901 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852924 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852945 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852968 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.852989 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853012 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853033 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853054 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853075 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853095 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853116 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853137 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853192 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853221 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853250 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853278 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853307 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853333 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853360 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853389 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853419 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853449 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853478 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853508 4752 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853537 4752 reconstruct.go:97] "Volume reconstruction finished" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.853555 4752 reconciler.go:26] "Reconciler: start to sync state" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.865745 4752 manager.go:324] Recovery completed Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.887558 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.890383 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.890625 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.890763 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.892518 4752 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.892552 4752 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.892589 4752 state_mem.go:36] "Initialized new in-memory state store" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.901569 4752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.905343 4752 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.905427 4752 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.905476 4752 kubelet.go:2335] "Starting kubelet main sync loop" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.905568 4752 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 27 17:35:00 crc kubenswrapper[4752]: W0227 17:35:00.908874 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.908985 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.915204 4752 policy_none.go:49] "None policy: Start" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.916906 4752 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.916961 4752 state_mem.go:35] "Initializing new in-memory state store" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.933309 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.978620 4752 manager.go:334] "Starting Device Plugin manager" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.978734 4752 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.978758 4752 server.go:79] "Starting device plugin registration server" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.979426 4752 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.979459 4752 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.980221 4752 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.980358 4752 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 27 17:35:00 crc kubenswrapper[4752]: I0227 17:35:00.980380 4752 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 27 17:35:00 crc kubenswrapper[4752]: E0227 17:35:00.997021 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.006408 4752 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Feb 27 17:35:01 crc kubenswrapper[4752]: 
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.008533 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.008587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.008606 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.008817 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.009418 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.009527 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.009966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.010001 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.010014 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.010127 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.010308 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.010374 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011100 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011218 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011371 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011422 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011613 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011633 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011677 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.011694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012012 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012042 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012178 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012294 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.012351 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013365 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013391 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013672 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013680 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013939 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013970 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.013984 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.014253 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.014329 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.016006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.016082 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.016103 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.029909 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="400ms" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056377 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056397 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056419 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056443 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056520 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056547 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056677 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056723 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056757 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056790 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056860 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056892 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.056935 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.082447 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.084097 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.084180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.084200 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.084241 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.084906 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158622 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158659 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158698 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158734 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158769 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158800 4752 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158833 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158867 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158927 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158891 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.158961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159013 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159036 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159030 4752 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159184 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159084 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159112 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159208 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159206 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159269 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159351 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 27 17:35:01 
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159437 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.159636 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.285192 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.287120 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.287310 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.287371 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.287431 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.288379 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.347587 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.359966 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.386516 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.406233 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.412395 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-167ad2c52bfea42fe5a0aed93bb1b00c76d572f005f3f6d3c9b996e79b0c5991 WatchSource:0}: Error finding container 167ad2c52bfea42fe5a0aed93bb1b00c76d572f005f3f6d3c9b996e79b0c5991: Status 404 returned error can't find the container with id 167ad2c52bfea42fe5a0aed93bb1b00c76d572f005f3f6d3c9b996e79b0c5991
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.417548 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.430185 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-77f30dc8c8e40a4e7e11e759dbc0259abb07aa8fdff0a4ac02c12b112dc1fd52 WatchSource:0}: Error finding container 77f30dc8c8e40a4e7e11e759dbc0259abb07aa8fdff0a4ac02c12b112dc1fd52: Status 404 returned error can't find the container with id 77f30dc8c8e40a4e7e11e759dbc0259abb07aa8fdff0a4ac02c12b112dc1fd52
Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.431407 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="800ms"
Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.435224 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-40663eabbc28c987d655d80e354344b50c5f748d3b4aa3557947e3587e691a55 WatchSource:0}: Error finding container 40663eabbc28c987d655d80e354344b50c5f748d3b4aa3557947e3587e691a55: Status 404 returned error can't find the container with id 40663eabbc28c987d655d80e354344b50c5f748d3b4aa3557947e3587e691a55
Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.455125 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d3b2b01599e7b7844993d1de4f319c53fe045a5ea0a8f928108ca8517c27c668 WatchSource:0}: Error finding container d3b2b01599e7b7844993d1de4f319c53fe045a5ea0a8f928108ca8517c27c668: Status 404 returned error can't find the container with id d3b2b01599e7b7844993d1de4f319c53fe045a5ea0a8f928108ca8517c27c668
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.689099 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.690357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.690391 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.690402 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.690430 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.690896 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc"
17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.690896 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc" Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.755113 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.755305 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.815434 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.875408 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.875534 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.913229 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"167ad2c52bfea42fe5a0aed93bb1b00c76d572f005f3f6d3c9b996e79b0c5991"} Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.915843 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d3b2b01599e7b7844993d1de4f319c53fe045a5ea0a8f928108ca8517c27c668"} Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.918483 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"40663eabbc28c987d655d80e354344b50c5f748d3b4aa3557947e3587e691a55"} Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.919659 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"77f30dc8c8e40a4e7e11e759dbc0259abb07aa8fdff0a4ac02c12b112dc1fd52"} Feb 27 17:35:01 crc kubenswrapper[4752]: I0227 17:35:01.921005 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2a9bff4bf3c508c693f68cdf8cd54ede424b4cc509d048917e79f12df417bf5a"} Feb 27 17:35:01 crc kubenswrapper[4752]: W0227 17:35:01.971476 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:01 crc kubenswrapper[4752]: E0227 17:35:01.971590 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:02 crc kubenswrapper[4752]: W0227 17:35:02.127595 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:02 crc kubenswrapper[4752]: E0227 17:35:02.127699 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:02 crc kubenswrapper[4752]: E0227 17:35:02.232314 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="1.6s" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.491570 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.493268 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.493338 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.493361 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.493408 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:02 crc kubenswrapper[4752]: E0227 17:35:02.493965 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.807084 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 17:35:02 crc kubenswrapper[4752]: E0227 17:35:02.808272 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.815391 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.925791 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61" exitCode=0 Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.925871 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.925989 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.927219 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.927246 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.927254 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.930935 4752 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd" exitCode=0 Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.931065 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.931063 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933552 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933575 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933585 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933733 4752 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440" exitCode=0 Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440"} Feb 27 
17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.933852 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.934909 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.934940 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.934954 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.936227 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07" exitCode=0 Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.936330 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.936327 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.937248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.937272 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.937281 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.939751 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.939772 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.939784 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.939794 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a"} Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.939820 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.940488 4752 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.940743 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.940770 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.940783 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.941416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.941453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:02 crc kubenswrapper[4752]: I0227 17:35:02.941467 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:03 crc kubenswrapper[4752]: E0227 17:35:03.030940 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.102:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.814954 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:03 crc kubenswrapper[4752]: E0227 17:35:03.837421 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="3.2s" Feb 27 17:35:03 crc kubenswrapper[4752]: W0227 17:35:03.844919 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:03 crc kubenswrapper[4752]: E0227 17:35:03.845005 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.944411 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb" exitCode=0 Feb 27 17:35:03 crc 
kubenswrapper[4752]: I0227 17:35:03.944544 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.944532 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.945515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.945547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.945559 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.946992 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.947124 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.951819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.951897 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.951916 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.958889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.958940 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.958953 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.959127 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.960708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.960771 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.960803 4752 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.973641 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.974358 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.974410 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.974428 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.974444 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70"} Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.975388 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.975418 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:03 crc kubenswrapper[4752]: I0227 17:35:03.975435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.094067 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.095164 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.095189 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.095198 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.095217 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:04 crc kubenswrapper[4752]: E0227 17:35:04.095572 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.102:6443: connect: connection refused" node="crc" Feb 27 17:35:04 crc kubenswrapper[4752]: W0227 17:35:04.183303 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.102:6443: connect: connection refused Feb 27 17:35:04 crc kubenswrapper[4752]: E0227 17:35:04.183418 4752 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.102:6443: connect: connection refused" logger="UnhandledError" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.946541 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.979512 4752 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090" exitCode=0 Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.979684 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.980268 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090"} Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.981003 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.981049 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.981066 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985052 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a296242cc32ff64b0c0f9d188a44949e85647fc0d505412834b30d141f0f2ca0"} Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985099 4752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985134 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985202 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985262 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.985410 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986380 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986537 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986897 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc 
kubenswrapper[4752]: I0227 17:35:04.986933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986965 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.986994 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.987037 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.987060 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.987007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:04 crc kubenswrapper[4752]: I0227 17:35:04.987119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.994204 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093"} Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.994275 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c"} Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.994298 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39"} Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.994335 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.994464 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.995769 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.995834 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:05 crc kubenswrapper[4752]: I0227 17:35:05.995853 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:06 crc kubenswrapper[4752]: I0227 17:35:06.218203 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:06 crc kubenswrapper[4752]: I0227 17:35:06.986564 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 17:35:06 crc kubenswrapper[4752]: I0227 17:35:06.999462 4752 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:06 crc kubenswrapper[4752]: I0227 17:35:06.999623 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.000720 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.000751 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.000760 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.002588 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003180 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003202 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8f6876d2286bd72207661c1ae8"} Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003225 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90"} Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003564 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.003581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.004062 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.004086 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.004096 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.006705 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.296110 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.297815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.297866 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.297883 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.297914 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:07 crc kubenswrapper[4752]: I0227 17:35:07.747116 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.005577 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.005693 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.005591 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007373 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007400 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007408 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007429 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007466 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.007491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.008097 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.008193 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.008213 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:08 crc kubenswrapper[4752]: I0227 17:35:08.921714 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.008439 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.009882 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.009941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.009963 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.125863 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.126343 4752 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.127922 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.127987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:09 crc kubenswrapper[4752]: I0227 17:35:09.128006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.010921 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.012113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.012212 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.012236 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.241300 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.241579 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.243079 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.243133 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.243169 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.279591 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.279807 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.281393 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.281475 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:10 crc kubenswrapper[4752]: I0227 17:35:10.281493 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:10 crc kubenswrapper[4752]: E0227 17:35:10.997663 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.795703 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.796094 4752 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.797453 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.797569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.797596 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:12 crc kubenswrapper[4752]: I0227 17:35:12.802816 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.018128 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.019490 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.019542 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.019558 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.280216 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 17:35:13 crc kubenswrapper[4752]: I0227 17:35:13.280350 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.187190 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.187260 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.610848 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.611428 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 27 17:35:14 crc kubenswrapper[4752]: W0227 17:35:14.736236 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.736341 4752 trace.go:236] Trace[1309503480]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 17:35:04.734) (total time: 10001ms): Feb 27 17:35:14 crc kubenswrapper[4752]: Trace[1309503480]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:35:14.736) Feb 27 17:35:14 crc kubenswrapper[4752]: Trace[1309503480]: [10.00143833s] [10.00143833s] END Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.736368 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.816755 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 27 17:35:14 crc kubenswrapper[4752]: W0227 17:35:14.844684 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.845064 4752 trace.go:236] Trace[431188041]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Feb-2026 17:35:04.843) (total time: 10001ms): Feb 27 17:35:14 crc kubenswrapper[4752]: Trace[431188041]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:35:14.844) Feb 27 17:35:14 crc kubenswrapper[4752]: Trace[431188041]: [10.001660266s] [10.001660266s] END Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.845367 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.988496 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.991595 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.992130 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 17:35:14 crc kubenswrapper[4752]: I0227 17:35:14.992301 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 17:35:14 crc kubenswrapper[4752]: W0227 17:35:14.992942 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.993029 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:14 crc kubenswrapper[4752]: W0227 17:35:14.992961 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.993545 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.994601 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 27 17:35:14 crc kubenswrapper[4752]: E0227 17:35:14.997076 4752 event.go:368] "Unable to write event (may retry after sleeping)" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:14Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:15 crc kubenswrapper[4752]: I0227 17:35:15.000484 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 17:35:15 crc kubenswrapper[4752]: I0227 17:35:15.000736 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 27 17:35:15 crc kubenswrapper[4752]: I0227 17:35:15.820815 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:15Z is after 2026-02-23T05:33:13Z Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.029198 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.031659 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a296242cc32ff64b0c0f9d188a44949e85647fc0d505412834b30d141f0f2ca0" exitCode=255 Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.031705 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a296242cc32ff64b0c0f9d188a44949e85647fc0d505412834b30d141f0f2ca0"} Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.031909 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.032790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.032844 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.032862 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.033578 
Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.033578 4752 scope.go:117] "RemoveContainer" containerID="a296242cc32ff64b0c0f9d188a44949e85647fc0d505412834b30d141f0f2ca0"
Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.222424 4752 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]log ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]etcd ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-filter ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-apiextensions-informers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-apiextensions-controllers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/crd-informer-synced ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-system-namespaces-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/bootstrap-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-registration-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-discovery-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]autoregister-completion ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-openapi-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 27 17:35:16 crc kubenswrapper[4752]: livez check failed
Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.222512 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 17:35:16 crc kubenswrapper[4752]: I0227 17:35:16.818946 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:16Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.038850 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.041567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034"}
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.041803 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.043085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.043126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.043167 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:17 crc kubenswrapper[4752]: I0227 17:35:17.820316 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:17Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.048059 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.048998 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.052131 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" exitCode=255
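The "[+]"/"[-]" block above is kube-apiserver's aggregated /livez output: every registered check reports ok or failed, and a single failure (here poststarthook/rbac/bootstrap-roles) turns the whole response into a 500, which the kubelet's prober counts as a startup-probe failure because only 2xx/3xx statuses pass. A toy re-creation of that aggregation pattern (assumed shape, not the actual apiserver code):

```go
package main

import (
	"fmt"
	"net/http"
)

// One named health check; ok() is whatever the check actually verifies.
type check struct {
	name string
	ok   func() bool
}

// livez renders "[+]name ok" / "[-]name failed" per check and returns
// 500 with a trailing "livez check failed" if any check failed.
func livez(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if c.ok() {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			} else {
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
				failed = true
			}
		}
		if failed {
			body += "livez check failed\n"
			w.WriteHeader(http.StatusInternalServerError)
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	http.Handle("/livez", livez([]check{
		{"ping", func() bool { return true }},
		{"etcd", func() bool { return true }},
		{"poststarthook/rbac/bootstrap-roles", func() bool { return false }},
	}))
	http.ListenAndServe(":8080", nil)
}
```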
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034"} Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.052295 4752 scope.go:117] "RemoveContainer" containerID="a296242cc32ff64b0c0f9d188a44949e85647fc0d505412834b30d141f0f2ca0" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.052573 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.054289 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.054328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.054350 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.055301 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" Feb 27 17:35:18 crc kubenswrapper[4752]: E0227 17:35:18.055653 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:18 crc kubenswrapper[4752]: W0227 17:35:18.242107 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:18Z is after 2026-02-23T05:33:13Z Feb 27 17:35:18 crc kubenswrapper[4752]: E0227 17:35:18.242300 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.821109 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:18Z is after 2026-02-23T05:33:13Z Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.957489 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.957741 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.959370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.959442 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.959463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:18 crc kubenswrapper[4752]: I0227 17:35:18.976322 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.058467 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.064056 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.065094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.065185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.065205 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:19 crc kubenswrapper[4752]: W0227 17:35:19.207840 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:19Z is after 2026-02-23T05:33:13Z Feb 27 17:35:19 crc kubenswrapper[4752]: E0227 17:35:19.207954 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:19 crc kubenswrapper[4752]: I0227 17:35:19.821182 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:19Z is after 2026-02-23T05:33:13Z Feb 27 17:35:20 crc kubenswrapper[4752]: I0227 17:35:20.820607 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:20Z is after 2026-02-23T05:33:13Z Feb 27 17:35:20 crc kubenswrapper[4752]: E0227 17:35:20.998594 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.232054 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.232336 4752 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.234242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.234305 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.234328 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.235262 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" Feb 27 17:35:21 crc kubenswrapper[4752]: E0227 17:35:21.235547 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.240344 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.392471 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.393778 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.393817 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.393832 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.393858 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:21 crc kubenswrapper[4752]: E0227 17:35:21.398748 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:21Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 17:35:21 crc kubenswrapper[4752]: E0227 17:35:21.400818 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:21Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 17:35:21 crc kubenswrapper[4752]: W0227 17:35:21.567013 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:21Z is after 2026-02-23T05:33:13Z Feb 27 17:35:21 crc kubenswrapper[4752]: E0227 17:35:21.567116 4752 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:21 crc kubenswrapper[4752]: I0227 17:35:21.819710 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:21Z is after 2026-02-23T05:33:13Z Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.073468 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.074899 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.074987 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.075007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.075781 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" Feb 27 17:35:22 crc kubenswrapper[4752]: E0227 17:35:22.076063 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:22 crc kubenswrapper[4752]: I0227 17:35:22.818409 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:22Z is after 2026-02-23T05:33:13Z Feb 27 17:35:23 crc kubenswrapper[4752]: I0227 17:35:23.280885 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 17:35:23 crc kubenswrapper[4752]: I0227 17:35:23.280984 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 17:35:23 crc kubenswrapper[4752]: I0227 17:35:23.717762 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates 
Feb 27 17:35:23 crc kubenswrapper[4752]: E0227 17:35:23.723557 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 17:35:23 crc kubenswrapper[4752]: I0227 17:35:23.819965 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:23Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.186995 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.187979 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.194085 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.194357 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.194521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.195579 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034"
Feb 27 17:35:24 crc kubenswrapper[4752]: E0227 17:35:24.196098 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.609619 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:35:24 crc kubenswrapper[4752]: I0227 17:35:24.820399 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:24Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:25 crc kubenswrapper[4752]: E0227 17:35:25.003259 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:25Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.080727 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.082110 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.082191 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.082211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.082980 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034"
Feb 27 17:35:25 crc kubenswrapper[4752]: E0227 17:35:25.083282 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 17:35:25 crc kubenswrapper[4752]: I0227 17:35:25.820516 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:25Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:26 crc kubenswrapper[4752]: W0227 17:35:26.430428 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:26Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:26 crc kubenswrapper[4752]: E0227 17:35:26.430508 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 17:35:26 crc kubenswrapper[4752]: W0227 17:35:26.568924 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:26Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:26 crc kubenswrapper[4752]: E0227 17:35:26.569012 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 17:35:26 crc kubenswrapper[4752]: I0227 17:35:26.818999 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:26Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:27 crc kubenswrapper[4752]: I0227 17:35:27.819879 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:27Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.399216 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.400776 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.400819 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.400837 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.400869 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:35:28 crc kubenswrapper[4752]: E0227 17:35:28.405718 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:28Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 27 17:35:28 crc kubenswrapper[4752]: E0227 17:35:28.406206 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:28Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 27 17:35:28 crc kubenswrapper[4752]: I0227 17:35:28.820601 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:28Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:29 crc kubenswrapper[4752]: W0227 17:35:29.727446 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:29Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:29 crc kubenswrapper[4752]: E0227 17:35:29.727543 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 27 17:35:29 crc kubenswrapper[4752]: I0227 17:35:29.817576 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:29Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:30 crc kubenswrapper[4752]: I0227 17:35:30.820353 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:30Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:30 crc kubenswrapper[4752]: E0227 17:35:30.999782 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 17:35:31 crc kubenswrapper[4752]: I0227 17:35:31.820071 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:31Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:32 crc kubenswrapper[4752]: I0227 17:35:32.820590 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.280921 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.281040 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.281118 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.281331 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.282752 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.282809 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.282831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.283528 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.283768 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d" gracePeriod=30
Feb 27 17:35:33 crc kubenswrapper[4752]: I0227 17:35:33.818073 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:33Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.109509 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.109971 4752 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d" exitCode=255
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.110014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d"}
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.110048 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7"}
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.110300 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.111375 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.111422 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.111441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:34 crc kubenswrapper[4752]: I0227 17:35:34.820771 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:34Z is after 2026-02-23T05:33:13Z
Feb 27 17:35:35 crc kubenswrapper[4752]: E0227 17:35:35.008751 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.112650 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.113929 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.113993 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.114016 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.406406 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.408035 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.408088 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.408185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.408232 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:35:35 crc kubenswrapper[4752]: E0227 17:35:35.410820 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:35Z is after 2026-02-23T05:33:13Z" interval="7s"
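The "Killing container with a grace period" entry above is the stop sequence for the probe-failed cluster-policy-controller container: a polite termination signal first, a hard kill only if the process outlives the grace period (30s here). A loose process-level sketch of the pattern; the actual kill is carried out by CRI-O, not by kubelet code like this:

```go
package main

import (
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to the grace period for the
// process to exit on its own, then falls back to SIGKILL.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	cmd.Process.Signal(syscall.SIGTERM) // polite stop request
	select {
	case <-done: // exited within the grace period
	case <-time.After(grace):
		cmd.Process.Kill() // hard stop once the grace period expires
	}
}

func main() {
	cmd := exec.Command("sleep", "300") // stand-in for a container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 30*time.Second)
}
```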
register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:35Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 17:35:35 crc kubenswrapper[4752]: I0227 17:35:35.820388 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:35Z is after 2026-02-23T05:33:13Z Feb 27 17:35:36 crc kubenswrapper[4752]: I0227 17:35:36.820664 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:36Z is after 2026-02-23T05:33:13Z Feb 27 17:35:37 crc kubenswrapper[4752]: I0227 17:35:37.820213 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:37Z is after 2026-02-23T05:33:13Z Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.819542 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:38Z is after 2026-02-23T05:33:13Z Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.906398 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.907647 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.907670 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.907678 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:38 crc kubenswrapper[4752]: I0227 17:35:38.908096 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" Feb 27 17:35:39 crc kubenswrapper[4752]: I0227 17:35:39.820010 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:39Z is after 2026-02-23T05:33:13Z Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.131297 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.132193 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.134716 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" exitCode=255 Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.134773 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846"} Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.134830 4752 scope.go:117] "RemoveContainer" containerID="aa0b7f18e358d9c9ce0fe9489ae0ed67ba45ea07fd93a1f437e09bc27d001034" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.135000 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.136231 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.136273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.136290 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.137164 4752 scope.go:117] "RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:35:40 crc kubenswrapper[4752]: E0227 17:35:40.139973 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.279933 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.280195 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.281485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.281544 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.281565 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:40 crc kubenswrapper[4752]: W0227 17:35:40.562037 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:40Z is after 2026-02-23T05:33:13Z Feb 27 17:35:40 crc kubenswrapper[4752]: 
E0227 17:35:40.562136 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:40 crc kubenswrapper[4752]: I0227 17:35:40.819683 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:40Z is after 2026-02-23T05:33:13Z Feb 27 17:35:41 crc kubenswrapper[4752]: E0227 17:35:41.000707 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:35:41 crc kubenswrapper[4752]: I0227 17:35:41.068208 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 17:35:41 crc kubenswrapper[4752]: E0227 17:35:41.075491 4752 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:41 crc kubenswrapper[4752]: E0227 17:35:41.076748 4752 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 27 17:35:41 crc kubenswrapper[4752]: I0227 17:35:41.141342 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 17:35:41 crc kubenswrapper[4752]: I0227 17:35:41.819290 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:41Z is after 2026-02-23T05:33:13Z Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.415266 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:42 crc kubenswrapper[4752]: E0227 17:35:42.416758 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:42Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.417634 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.417723 4752 kubelet_node_status.go:724] "Recording event 
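"Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" wraps the stock error from k8s.io/apimachinery/pkg/util/wait, produced once a retried condition (here, posting a CertificateSigningRequest against the expired-cert endpoint) never succeeds within its step budget. A minimal sketch of the pattern; the parameter values are assumptions, not the kubelet's actual CSR back-off settings:

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	err := wait.ExponentialBackoff(wait.Backoff{
		Duration: 2 * time.Second, // initial delay (assumed value)
		Factor:   2.0,             // double on each attempt
		Steps:    5,               // give up after five tries
	}, func() (bool, error) {
		// A real caller would create the CSR here and return true on
		// success; returning false forever exhausts the step budget.
		return false, nil
	})
	fmt.Println(err) // prints: timed out waiting for the condition
}
```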
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.417776 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.417809 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:42 crc kubenswrapper[4752]: E0227 17:35:42.423612 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:42Z is after 2026-02-23T05:33:13Z" node="crc" Feb 27 17:35:42 crc kubenswrapper[4752]: W0227 17:35:42.556604 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:42Z is after 2026-02-23T05:33:13Z Feb 27 17:35:42 crc kubenswrapper[4752]: E0227 17:35:42.556707 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.796000 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.796267 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.797768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.797849 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.797943 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:42 crc kubenswrapper[4752]: I0227 17:35:42.820326 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:42Z is after 2026-02-23T05:33:13Z Feb 27 17:35:43 crc kubenswrapper[4752]: I0227 17:35:43.280379 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 17:35:43 crc kubenswrapper[4752]: I0227 17:35:43.280584 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 17:35:43 crc kubenswrapper[4752]: W0227 17:35:43.438330 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:43Z is after 2026-02-23T05:33:13Z Feb 27 17:35:43 crc kubenswrapper[4752]: E0227 17:35:43.438449 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:43 crc kubenswrapper[4752]: I0227 17:35:43.820462 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:43Z is after 2026-02-23T05:33:13Z Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.186708 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.186893 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.188313 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.188474 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.188570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.189595 4752 scope.go:117] "RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:35:44 crc kubenswrapper[4752]: E0227 17:35:44.189962 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.610321 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:35:44 crc kubenswrapper[4752]: W0227 17:35:44.650431 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:44Z is after 2026-02-23T05:33:13Z Feb 27 17:35:44 crc kubenswrapper[4752]: E0227 17:35:44.650543 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 27 17:35:44 crc kubenswrapper[4752]: I0227 17:35:44.822394 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:44Z is after 2026-02-23T05:33:13Z Feb 27 17:35:45 crc kubenswrapper[4752]: E0227 17:35:45.015439 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.155493 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.157224 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.157524 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.157729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.158865 4752 scope.go:117] "RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:35:45 crc kubenswrapper[4752]: E0227 17:35:45.159319 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:45 crc kubenswrapper[4752]: I0227 17:35:45.822791 4752 csi_plugin.go:884] Failed to contact API server when waiting for 
Feb 27 17:35:46 crc kubenswrapper[4752]: I0227 17:35:46.821548 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:47 crc kubenswrapper[4752]: I0227 17:35:47.821847 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:48 crc kubenswrapper[4752]: I0227 17:35:48.823555 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.423763 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:49 crc kubenswrapper[4752]: E0227 17:35:49.424419 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.425722 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.425772 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.425785 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.425821 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:35:49 crc kubenswrapper[4752]: E0227 17:35:49.433014 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 27 17:35:49 crc kubenswrapper[4752]: I0227 17:35:49.822697 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.251671 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.251937 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.254135 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.254269 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.254299 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
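[Editor's note] The error mode shifts here: once the expired client certificate is rejected at the TLS layer, the API server has no credentials to authenticate and treats the kubelet as system:anonymous, which holds no RBAC grants. That is why node registration, lease renewal, CSINode reads, and event writes all now fail with "forbidden" rather than a TLS error. A minimal client-go sketch of the same authorization question (equivalent to `kubectl auth can-i create nodes`); the kubeconfig path is illustrative, assuming the kubelet's usual /var/lib/kubelet/kubeconfig location:

package main

import (
	"context"
	"fmt"

	authv1 "k8s.io/api/authorization/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical kubeconfig path for illustration.
	config, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}
	// Ask the API server whether the current credentials may create Node
	// objects, the check behind "Unable to register node with API server".
	ssar := &authv1.SelfSubjectAccessReview{
		Spec: authv1.SelfSubjectAccessReviewSpec{
			ResourceAttributes: &authv1.ResourceAttributes{
				Verb:     "create",
				Resource: "nodes",
			},
		},
	}
	resp, err := clientset.AuthorizationV1().SelfSubjectAccessReviews().
		Create(context.TODO(), ssar, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("allowed=%v reason=%q\n", resp.Status.Allowed, resp.Status.Reason)
}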
Feb 27 17:35:50 crc kubenswrapper[4752]: I0227 17:35:50.824177 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:51 crc kubenswrapper[4752]: E0227 17:35:51.000851 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 17:35:51 crc kubenswrapper[4752]: I0227 17:35:51.823462 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:52 crc kubenswrapper[4752]: I0227 17:35:52.822575 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:53 crc kubenswrapper[4752]: I0227 17:35:53.281243 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 27 17:35:53 crc kubenswrapper[4752]: I0227 17:35:53.281373 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 27 17:35:53 crc kubenswrapper[4752]: I0227 17:35:53.822315 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:54 crc kubenswrapper[4752]: I0227 17:35:54.820991 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.027119 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8a703b7cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,LastTimestamp:2026-02-27 17:35:00.812584908 +0000 UTC m=+0.719401799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
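[Editor's note] The startup-probe failures above are ordinary HTTP timeouts: the kubelet's prober allows the probe's timeoutSeconds (1s by default) for response headers, and the cluster-policy-controller endpoint at 192.168.126.11:10357 does not answer in time. A minimal sketch of the same behavior follows; InsecureSkipVerify stands in for the prober's handling of self-signed serving certs (an assumption), and the exact timeout error wording varies by Go version (the log above is from an older client).

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		// Kubernetes probes default to timeoutSeconds: 1.
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	// Endpoint taken from the log entries above.
	resp, err := client.Get("https://192.168.126.11:10357/healthz")
	if err != nil {
		// On timeout this reports "... (Client.Timeout exceeded while awaiting headers)".
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status)
}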
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.034494 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.046663 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.054309 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.059468 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8b14ec935 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.985276725 +0000 UTC m=+0.892093606,LastTimestamp:2026-02-27 17:35:00.985276725 +0000 UTC m=+0.892093606,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.068000 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.008567547 +0000 UTC m=+0.915384428,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.074012 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.008600139 +0000 UTC m=+0.915417030,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.080172 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.00861612 +0000 UTC m=+0.915433001,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.085898 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.009990772 +0000 UTC m=+0.916807633,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.091221 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.010008683 +0000 UTC m=+0.916825554,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.095491 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.010020593 +0000 UTC m=+0.916837454,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.100444 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.011085611 +0000 UTC m=+0.917902472,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.106189 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.011109352 +0000 UTC m=+0.917926213,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.111527 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.011119483 +0000 UTC m=+0.917936344,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.116679 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.011593324 +0000 UTC m=+0.918410215,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.121577 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.011625926 +0000 UTC m=+0.918442827,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.130621 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.011643557 +0000 UTC m=+0.918460448,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.135005 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.011668028 +0000 UTC m=+0.918484919,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.139798 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.011688839 +0000 UTC m=+0.918505720,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.144587 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.011706349 +0000 UTC m=+0.918523240,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.150299 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.012030394 +0000 UTC m=+0.918847255,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
\"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.012030394 +0000 UTC m=+0.918847255,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.186431 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abac421f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abac421f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890739231 +0000 UTC m=+0.797556112,LastTimestamp:2026-02-27 17:35:01.012051505 +0000 UTC m=+0.918868366,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.194372 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaffc62\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaffc62 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890983522 +0000 UTC m=+0.797800403,LastTimestamp:2026-02-27 17:35:01.012061205 +0000 UTC m=+0.918878066,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.198648 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.013351384 +0000 UTC m=+0.920168275,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.203028 4752 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.18982af8abaa053f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18982af8abaa053f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:00.890592575 +0000 UTC m=+0.797409486,LastTimestamp:2026-02-27 17:35:01.013378625 +0000 UTC m=+0.920195506,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.208065 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af8cafcd9f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:01.41611468 +0000 UTC m=+1.322931531,LastTimestamp:2026-02-27 17:35:01.41611468 +0000 UTC m=+1.322931531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.212036 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af8cb61d1e5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:01.422731749 +0000 UTC m=+1.329548590,LastTimestamp:2026-02-27 17:35:01.422731749 +0000 UTC m=+1.329548590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.223395 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af8cc20855e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.227551 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af8cca9557d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:01.444195709 +0000 UTC m=+1.351012600,LastTimestamp:2026-02-27 17:35:01.444195709 +0000 UTC m=+1.351012600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.231621 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af8cdeec20f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:01.465522703 +0000 UTC m=+1.372339594,LastTimestamp:2026-02-27 17:35:01.465522703 +0000 UTC m=+1.372339594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.238316 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af8f0da389c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.051379356 +0000 UTC m=+1.958196447,LastTimestamp:2026-02-27 17:35:02.051379356 +0000 UTC m=+1.958196447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.244596 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af8f10bb74d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.054623053 +0000 UTC m=+1.961439914,LastTimestamp:2026-02-27 17:35:02.054623053 +0000 UTC m=+1.961439914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.251076 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af8f136cc79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.057446521 +0000 UTC m=+1.964263382,LastTimestamp:2026-02-27 17:35:02.057446521 +0000 UTC m=+1.964263382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.258771 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af8f156f3b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.059553716 +0000 UTC m=+1.966370577,LastTimestamp:2026-02-27 17:35:02.059553716 +0000 UTC m=+1.966370577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.262732 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af8f1a61574 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.0647397 +0000 UTC m=+1.971556561,LastTimestamp:2026-02-27 17:35:02.0647397 +0000 UTC m=+1.971556561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.264457 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af8f20a4c85 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.071307397 +0000 UTC m=+1.978124258,LastTimestamp:2026-02-27 17:35:02.071307397 +0000 UTC m=+1.978124258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.271508 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af8f24014cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.074832076 +0000 UTC m=+1.981648967,LastTimestamp:2026-02-27 17:35:02.074832076 +0000 UTC m=+1.981648967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.278468 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af8f24a17d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.075488216 +0000 UTC m=+1.982305087,LastTimestamp:2026-02-27 17:35:02.075488216 +0000 UTC m=+1.982305087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
event="&Event{ObjectMeta:{kube-apiserver-crc.18982af8f24a17d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.075488216 +0000 UTC m=+1.982305087,LastTimestamp:2026-02-27 17:35:02.075488216 +0000 UTC m=+1.982305087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.283860 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af8f24bbd13 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.075596051 +0000 UTC m=+1.982412912,LastTimestamp:2026-02-27 17:35:02.075596051 +0000 UTC m=+1.982412912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.290639 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af8f25638fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.076283132 +0000 UTC m=+1.983100023,LastTimestamp:2026-02-27 17:35:02.076283132 +0000 UTC m=+1.983100023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.295715 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af8f2d37791 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.299655 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af9076a5622 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.42992285 +0000 UTC m=+2.336739731,LastTimestamp:2026-02-27 17:35:02.42992285 +0000 UTC m=+2.336739731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.304403 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af9081d3df5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.441647605 +0000 UTC m=+2.348464496,LastTimestamp:2026-02-27 17:35:02.441647605 +0000 UTC m=+2.348464496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.315062 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af90830af12 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.442921746 +0000 UTC m=+2.349738627,LastTimestamp:2026-02-27 17:35:02.442921746 +0000 UTC m=+2.349738627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.442921746 +0000 UTC m=+2.349738627,LastTimestamp:2026-02-27 17:35:02.442921746 +0000 UTC m=+2.349738627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.319509 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af9147313ca openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.648599498 +0000 UTC m=+2.555416349,LastTimestamp:2026-02-27 17:35:02.648599498 +0000 UTC m=+2.555416349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.325247 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af915343bb5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.661258165 +0000 UTC m=+2.568075016,LastTimestamp:2026-02-27 17:35:02.661258165 +0000 UTC m=+2.568075016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.331492 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af915452472 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.662366322 +0000 UTC m=+2.569183173,LastTimestamp:2026-02-27 17:35:02.662366322 +0000 UTC m=+2.569183173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.336986 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af91fee7f4a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.841237322 +0000 UTC m=+2.748054183,LastTimestamp:2026-02-27 17:35:02.841237322 +0000 UTC m=+2.748054183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.343873 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af920839a10 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.85100904 +0000 UTC m=+2.757825891,LastTimestamp:2026-02-27 17:35:02.85100904 +0000 UTC m=+2.757825891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.351028 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af92541f731 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.356303 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af925840aa4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.93492394 +0000 UTC m=+2.841740791,LastTimestamp:2026-02-27 17:35:02.93492394 +0000 UTC m=+2.841740791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.363120 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af925d45ffc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.940188668 +0000 UTC m=+2.847005529,LastTimestamp:2026-02-27 17:35:02.940188668 +0000 UTC m=+2.847005529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.365911 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af925d4d39d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.940218269 +0000 UTC m=+2.847035120,LastTimestamp:2026-02-27 17:35:02.940218269 +0000 UTC m=+2.847035120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.940218269 +0000 UTC m=+2.847035120,LastTimestamp:2026-02-27 17:35:02.940218269 +0000 UTC m=+2.847035120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.367951 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af9324d0cfb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.149423867 +0000 UTC m=+3.056240718,LastTimestamp:2026-02-27 17:35:03.149423867 +0000 UTC m=+3.056240718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.370948 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af93288fc8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.153351822 +0000 UTC m=+3.060168673,LastTimestamp:2026-02-27 17:35:03.153351822 +0000 UTC m=+3.060168673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.374460 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af9328c88f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.153584368 +0000 UTC m=+3.060401219,LastTimestamp:2026-02-27 17:35:03.153584368 +0000 UTC m=+3.060401219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.378409 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af93291e366 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.153935206 +0000 UTC m=+3.060752057,LastTimestamp:2026-02-27 17:35:03.153935206 +0000 UTC m=+3.060752057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.380003 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af932df1e8b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.158996619 +0000 UTC m=+3.065813490,LastTimestamp:2026-02-27 17:35:03.158996619 +0000 UTC m=+3.065813490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.384284 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af932f1f3d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.160230869 +0000 UTC m=+3.067047720,LastTimestamp:2026-02-27 17:35:03.160230869 +0000 UTC m=+3.067047720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.388900 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af9346086d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.184254674 +0000 UTC m=+3.091071525,LastTimestamp:2026-02-27 17:35:03.184254674 +0000 UTC m=+3.091071525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.394341 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af934767109 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.185690889 +0000 UTC m=+3.092507750,LastTimestamp:2026-02-27 17:35:03.185690889 +0000 UTC m=+3.092507750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.398681 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af934e99c90 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.193238672 +0000 UTC m=+3.100055543,LastTimestamp:2026-02-27 17:35:03.193238672 +0000 UTC m=+3.100055543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.405259 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18982af934f2a0b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.193829556 +0000 UTC m=+3.100646407,LastTimestamp:2026-02-27 17:35:03.193829556 +0000 UTC m=+3.100646407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.410136 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af940955fea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.389044714 +0000 UTC m=+3.295861575,LastTimestamp:2026-02-27 17:35:03.389044714 +0000 UTC m=+3.295861575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.416636 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af940a44d87 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.390023047 +0000 UTC m=+3.296839908,LastTimestamp:2026-02-27 17:35:03.390023047 +0000 UTC m=+3.296839908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.420970 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af941a1566b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.406605931 +0000 UTC m=+3.313422782,LastTimestamp:2026-02-27 17:35:03.406605931 +0000 UTC m=+3.313422782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 
17:35:55.429574 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af941a17e61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.406616161 +0000 UTC m=+3.313433042,LastTimestamp:2026-02-27 17:35:03.406616161 +0000 UTC m=+3.313433042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.436289 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af941b135ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.407646126 +0000 UTC m=+3.314463007,LastTimestamp:2026-02-27 17:35:03.407646126 +0000 UTC m=+3.314463007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.440100 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af941b6853e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.407994174 +0000 UTC m=+3.314811035,LastTimestamp:2026-02-27 17:35:03.407994174 +0000 UTC m=+3.314811035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.445233 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af94f88ea25 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.639886373 +0000 UTC m=+3.546703224,LastTimestamp:2026-02-27 17:35:03.639886373 +0000 UTC m=+3.546703224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.451873 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af94f973cd6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.640825046 +0000 UTC m=+3.547641907,LastTimestamp:2026-02-27 17:35:03.640825046 +0000 UTC m=+3.547641907,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.456386 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af9506639fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.654390266 +0000 UTC m=+3.561207127,LastTimestamp:2026-02-27 17:35:03.654390266 +0000 UTC m=+3.561207127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.462807 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af95080d4d9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.656133849 +0000 UTC m=+3.562950710,LastTimestamp:2026-02-27 17:35:03.656133849 +0000 UTC m=+3.562950710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.469278 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18982af95091b1c9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.657238985 +0000 UTC m=+3.564055846,LastTimestamp:2026-02-27 17:35:03.657238985 +0000 UTC m=+3.564055846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.475828 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af95e702f35 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.889923893 +0000 UTC m=+3.796740754,LastTimestamp:2026-02-27 17:35:03.889923893 +0000 UTC m=+3.796740754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.482187 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af95f32a2e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.902667493 +0000 UTC m=+3.809484354,LastTimestamp:2026-02-27 17:35:03.902667493 +0000 UTC m=+3.809484354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.489832 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af95f42448e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.903691918 +0000 UTC m=+3.810508789,LastTimestamp:2026-02-27 17:35:03.903691918 +0000 UTC m=+3.810508789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.496494 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af961d8ed02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:03.947119874 +0000 UTC m=+3.853936725,LastTimestamp:2026-02-27 17:35:03.947119874 +0000 UTC m=+3.853936725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.504550 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af96b8cfb0d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:04.109914893 +0000 UTC m=+4.016731744,LastTimestamp:2026-02-27 17:35:04.109914893 +0000 UTC m=+4.016731744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.511232 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af96c325c90 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:04.120753296 +0000 UTC m=+4.027570147,LastTimestamp:2026-02-27 17:35:04.120753296 +0000 UTC m=+4.027570147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.518110 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982af96c77e86f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:04.125311087 +0000 UTC m=+4.032127928,LastTimestamp:2026-02-27 17:35:04.125311087 +0000 UTC m=+4.032127928,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.524617 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af96cd505e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:04.131413475 +0000 UTC m=+4.038230326,LastTimestamp:2026-02-27 17:35:04.131413475 +0000 UTC m=+4.038230326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.531452 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af99f9c285a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:04.983324762 +0000 UTC m=+4.890141643,LastTimestamp:2026-02-27 17:35:04.983324762 +0000 UTC m=+4.890141643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.539121 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9adabcabf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.219230399 +0000 UTC m=+5.126047280,LastTimestamp:2026-02-27 17:35:05.219230399 +0000 UTC m=+5.126047280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.546872 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9ae9c517c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.234993532 +0000 UTC m=+5.141810423,LastTimestamp:2026-02-27 17:35:05.234993532 +0000 UTC m=+5.141810423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.553341 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9aeb7a07a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.236783226 +0000 UTC m=+5.143600117,LastTimestamp:2026-02-27 17:35:05.236783226 +0000 UTC m=+5.143600117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.560177 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9bed287ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.506981866 +0000 UTC m=+5.413798757,LastTimestamp:2026-02-27 17:35:05.506981866 +0000 UTC m=+5.413798757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.566317 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9bfcc9bcb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.523370955 +0000 UTC m=+5.430187836,LastTimestamp:2026-02-27 17:35:05.523370955 +0000 UTC m=+5.430187836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.573356 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9c010418e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.527804302 +0000 UTC m=+5.434621193,LastTimestamp:2026-02-27 17:35:05.527804302 +0000 UTC 
m=+5.434621193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.577647 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9d066a178 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.801900408 +0000 UTC m=+5.708717299,LastTimestamp:2026-02-27 17:35:05.801900408 +0000 UTC m=+5.708717299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.583581 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9d16a7eb8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.818930872 +0000 UTC m=+5.725747763,LastTimestamp:2026-02-27 17:35:05.818930872 +0000 UTC m=+5.725747763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.589923 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9d1848256 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:05.820635734 +0000 UTC m=+5.727452625,LastTimestamp:2026-02-27 17:35:05.820635734 +0000 UTC m=+5.727452625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.594749 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9e13a9a82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:06.084227714 +0000 UTC m=+5.991044565,LastTimestamp:2026-02-27 17:35:06.084227714 +0000 UTC m=+5.991044565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.599595 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9e24163aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:06.101449642 +0000 UTC m=+6.008266493,LastTimestamp:2026-02-27 17:35:06.101449642 +0000 UTC m=+6.008266493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.603753 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9e25931c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:06.10300973 +0000 UTC m=+6.009826611,LastTimestamp:2026-02-27 17:35:06.10300973 +0000 UTC m=+6.009826611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.608464 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9f13ce58f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:06.352813455 +0000 UTC m=+6.259630306,LastTimestamp:2026-02-27 17:35:06.352813455 +0000 UTC m=+6.259630306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.612804 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18982af9f242fc47 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:06.369989703 +0000 UTC m=+6.276806584,LastTimestamp:2026-02-27 17:35:06.369989703 +0000 UTC m=+6.276806584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.682416 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-controller-manager-crc.18982afb8e25ed04 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:13.280298244 +0000 UTC m=+13.187115125,LastTimestamp:2026-02-27 17:35:13.280298244 +0000 UTC m=+13.187115125,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.690032 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982afb8e278e73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:13.280405107 +0000 UTC m=+13.187222018,LastTimestamp:2026-02-27 17:35:13.280405107 +0000 UTC m=+13.187222018,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: I0227 17:35:55.884586 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.884821 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-apiserver-crc.18982afbc434c613 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.187240979 +0000 UTC m=+14.094057840,LastTimestamp:2026-02-27 17:35:14.187240979 +0000 UTC m=+14.094057840,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.887267 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982afbc4356bb4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.18728338 +0000 UTC m=+14.094100241,LastTimestamp:2026-02-27 17:35:14.18728338 +0000 UTC m=+14.094100241,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.890582 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-apiserver-crc.18982afbdd7cb5fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: 
Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.611385852 +0000 UTC m=+14.518202763,LastTimestamp:2026-02-27 17:35:14.611385852 +0000 UTC m=+14.518202763,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.895017 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982afbdd7fef8d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.611597197 +0000 UTC m=+14.518414108,LastTimestamp:2026-02-27 17:35:14.611597197 +0000 UTC m=+14.518414108,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.901549 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-apiserver-crc.18982afbf42fec07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 17:35:55 crc kubenswrapper[4752]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 17:35:55 crc kubenswrapper[4752]: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.992229383 +0000 UTC m=+14.899046284,LastTimestamp:2026-02-27 17:35:14.992229383 +0000 UTC m=+14.899046284,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.907424 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18982afbf431983e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.992339006 +0000 UTC m=+14.899155897,LastTimestamp:2026-02-27 17:35:14.992339006 +0000 UTC m=+14.899155897,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.913326 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18982afbf42fec07\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-apiserver-crc.18982afbf42fec07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 27 17:35:55 crc kubenswrapper[4752]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 27 17:35:55 crc kubenswrapper[4752]: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:14.992229383 +0000 UTC m=+14.899046284,LastTimestamp:2026-02-27 17:35:15.000695379 +0000 UTC m=+14.907512280,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.919371 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-controller-manager-crc.18982afde23be0a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.280957605 +0000 UTC m=+23.187774526,LastTimestamp:2026-02-27 17:35:23.280957605 +0000 UTC m=+23.187774526,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc 
kubenswrapper[4752]: E0227 17:35:55.926594 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982afde23cf811 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.281029137 +0000 UTC m=+23.187846028,LastTimestamp:2026-02-27 17:35:23.281029137 +0000 UTC m=+23.187846028,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.933161 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982afde23be0a5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-controller-manager-crc.18982afde23be0a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.280957605 +0000 UTC m=+23.187774526,LastTimestamp:2026-02-27 17:35:33.281010241 +0000 UTC m=+33.187827122,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.943545 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982afde23cf811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982afde23cf811 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.281029137 +0000 UTC m=+23.187846028,LastTimestamp:2026-02-27 17:35:33.281079312 +0000 UTC m=+33.187896203,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.951086 4752 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982b00367242c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:33.283742407 +0000 UTC m=+33.190559288,LastTimestamp:2026-02-27 17:35:33.283742407 +0000 UTC m=+33.190559288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.958722 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982af8f25638fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af8f25638fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.076283132 +0000 UTC m=+1.983100023,LastTimestamp:2026-02-27 17:35:33.403075389 +0000 UTC m=+33.309892280,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.964334 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982af9076a5622\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af9076a5622 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.42992285 +0000 UTC m=+2.336739731,LastTimestamp:2026-02-27 17:35:33.616229632 +0000 UTC m=+33.523046503,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.968420 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982af9081d3df5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982af9081d3df5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:02.441647605 +0000 UTC m=+2.348464496,LastTimestamp:2026-02-27 17:35:33.628084751 +0000 UTC m=+33.534901642,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.984878 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982afde23be0a5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-controller-manager-crc.18982afde23be0a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.280957605 +0000 UTC m=+23.187774526,LastTimestamp:2026-02-27 17:35:43.280513964 +0000 UTC m=+43.187330845,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.988967 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982afde23cf811\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18982afde23cf811 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.281029137 +0000 UTC m=+23.187846028,LastTimestamp:2026-02-27 17:35:43.280646267 +0000 UTC m=+43.187463158,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:35:55 crc kubenswrapper[4752]: E0227 17:35:55.992784 4752 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18982afde23be0a5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 27 17:35:55 crc kubenswrapper[4752]: &Event{ObjectMeta:{kube-controller-manager-crc.18982afde23be0a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 27 17:35:55 crc kubenswrapper[4752]: body: Feb 27 17:35:55 crc kubenswrapper[4752]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:35:23.280957605 +0000 UTC m=+23.187774526,LastTimestamp:2026-02-27 17:35:53.281341729 +0000 UTC m=+53.188158620,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 27 17:35:55 crc kubenswrapper[4752]: > Feb 27 17:35:56 crc kubenswrapper[4752]: E0227 17:35:56.432660 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.433691 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.435042 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.435087 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.435099 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.435139 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:35:56 crc kubenswrapper[4752]: E0227 17:35:56.444071 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot 
create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.820842 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.906528 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.908759 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.908917 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.909028 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:35:56 crc kubenswrapper[4752]: I0227 17:35:56.909814 4752 scope.go:117] "RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:35:56 crc kubenswrapper[4752]: E0227 17:35:56.910126 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:35:57 crc kubenswrapper[4752]: I0227 17:35:57.818659 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:35:58 crc kubenswrapper[4752]: I0227 17:35:58.818618 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:35:59 crc kubenswrapper[4752]: I0227 17:35:59.822768 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.284443 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.284669 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.286083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.286165 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.286185 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 
17:36:00.290302 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:36:00 crc kubenswrapper[4752]: I0227 17:36:00.822395 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:01 crc kubenswrapper[4752]: E0227 17:36:01.001095 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:36:01 crc kubenswrapper[4752]: I0227 17:36:01.202187 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:01 crc kubenswrapper[4752]: I0227 17:36:01.202924 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:01 crc kubenswrapper[4752]: I0227 17:36:01.202955 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:01 crc kubenswrapper[4752]: I0227 17:36:01.202964 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:01 crc kubenswrapper[4752]: I0227 17:36:01.819268 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:02 crc kubenswrapper[4752]: I0227 17:36:02.822650 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:03 crc kubenswrapper[4752]: E0227 17:36:03.440533 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.444572 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.446697 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.446766 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.446790 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.446845 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:36:03 crc kubenswrapper[4752]: E0227 17:36:03.454774 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 17:36:03 crc kubenswrapper[4752]: I0227 17:36:03.821471 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:04 crc kubenswrapper[4752]: I0227 17:36:04.822359 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:05 crc kubenswrapper[4752]: I0227 17:36:05.822650 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:06 crc kubenswrapper[4752]: I0227 17:36:06.822354 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:07 crc kubenswrapper[4752]: I0227 17:36:07.821607 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:08 crc kubenswrapper[4752]: I0227 17:36:08.822888 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.820745 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.906807 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.909097 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.909210 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.909240 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:09 crc kubenswrapper[4752]: I0227 17:36:09.910628 4752 scope.go:117] "RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.233904 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.236545 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3"} Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.236785 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:10 crc 
kubenswrapper[4752]: I0227 17:36:10.238598 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.238637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.238654 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:10 crc kubenswrapper[4752]: E0227 17:36:10.447492 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.456772 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.458517 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.458569 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.458581 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.458613 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 27 17:36:10 crc kubenswrapper[4752]: E0227 17:36:10.464849 4752 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 27 17:36:10 crc kubenswrapper[4752]: I0227 17:36:10.818626 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:11 crc kubenswrapper[4752]: E0227 17:36:11.002011 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.243307 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.243851 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.246555 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3" exitCode=255 Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.246594 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3"} Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.246633 4752 scope.go:117] 
"RemoveContainer" containerID="f2e648140231fd27b564a745e96076d9137e89a3603d990dd0370e47a1f2c846" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.246903 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.248403 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.248455 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.248472 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.249356 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3" Feb 27 17:36:11 crc kubenswrapper[4752]: E0227 17:36:11.249629 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:36:11 crc kubenswrapper[4752]: W0227 17:36:11.776058 4752 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 27 17:36:11 crc kubenswrapper[4752]: E0227 17:36:11.776138 4752 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 27 17:36:11 crc kubenswrapper[4752]: I0227 17:36:11.819536 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:12 crc kubenswrapper[4752]: I0227 17:36:12.251679 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 17:36:12 crc kubenswrapper[4752]: I0227 17:36:12.828484 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 27 17:36:13 crc kubenswrapper[4752]: I0227 17:36:13.078206 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 27 17:36:13 crc kubenswrapper[4752]: I0227 17:36:13.098999 4752 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 17:36:13 crc kubenswrapper[4752]: I0227 17:36:13.821254 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.186693 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.186960 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.188613 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.188669 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.188694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.189614 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3"
Feb 27 17:36:14 crc kubenswrapper[4752]: E0227 17:36:14.189919 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.609544 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.609869 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.611560 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.611625 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.611644 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.612694 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3"
Feb 27 17:36:14 crc kubenswrapper[4752]: E0227 17:36:14.612993 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 27 17:36:14 crc kubenswrapper[4752]: I0227 17:36:14.821548 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:36:15 crc kubenswrapper[4752]: I0227 17:36:15.822499 4752 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 27 17:36:16 crc kubenswrapper[4752]: I0227 17:36:16.561316 4752 csr.go:261] certificate signing request csr-mrkxx is approved, waiting to be issued
Feb 27 17:36:16 crc kubenswrapper[4752]: I0227 17:36:16.574430 4752 csr.go:257] certificate signing request csr-mrkxx is issued
Feb 27 17:36:16 crc kubenswrapper[4752]: I0227 17:36:16.593293 4752 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 27 17:36:16 crc kubenswrapper[4752]: I0227 17:36:16.654416 4752 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.464977 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.467005 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.467068 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.467094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.467476 4752 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.479433 4752 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.479642 4752 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.479734 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.484788 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.484846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.484863 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.484889 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.484921 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:17Z","lastTransitionTime":"2026-02-27T17:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.507385 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.519417 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.519503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.519555 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.519583 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.519689 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:17Z","lastTransitionTime":"2026-02-27T17:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.536054 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.547447 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.547543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.547570 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.547645 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.547779 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:17Z","lastTransitionTime":"2026-02-27T17:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.565692 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status-patch payload omitted: verbatim identical to the preceding attempt above ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.575238 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 23:05:19.137552382 +0000 UTC
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.575448 4752 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6845h29m1.562114117s for next certificate rotation
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.577894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.578039 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.578070 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.578129 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:17 crc kubenswrapper[4752]: I0227 17:36:17.578189 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:17Z","lastTransitionTime":"2026-02-27T17:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.595807 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status-patch payload omitted: verbatim identical to the 17:36:17.565692 attempt above ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.596113 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 27 17:36:17 crc kubenswrapper[4752]: E0227 17:36:17.596197 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
[... 13 identical entries omitted: E0227 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found", logged roughly every 100ms from 17:36:17.696575 through 17:36:18.905804 ...]
Feb 27 17:36:18 crc kubenswrapper[4752]: I0227 17:36:18.905924 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:18 crc kubenswrapper[4752]: I0227 17:36:18.907495 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:18 crc kubenswrapper[4752]: I0227 17:36:18.907549 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:18 crc kubenswrapper[4752]: I0227 17:36:18.907567 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
[... 20 identical "Error getting the current node from lister" entries omitted, 17:36:19.006875 through 17:36:20.918266 ...]
Feb 27 17:36:21 crc kubenswrapper[4752]: E0227 17:36:21.002135 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
[... 16 identical "Error getting the current node from lister" entries omitted, 17:36:21.018919 through 17:36:22.529421 ...]
Feb 27 17:36:22 crc kubenswrapper[4752]: I0227 17:36:22.563758 4752 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
[... 23 identical "Error getting the current node from lister" entries omitted, 17:36:22.629624 through 17:36:24.844200 ...]
Feb 27 17:36:24 crc kubenswrapper[4752]: I0227 17:36:24.927047 4752 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
[... 10 identical "Error getting the current node from lister" entries omitted, 17:36:24.945047 through 17:36:25.851889 ...]
Feb 27 17:36:25 crc kubenswrapper[4752]: I0227 17:36:25.905949 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:25 crc kubenswrapper[4752]: I0227 17:36:25.907161 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:25 crc kubenswrapper[4752]: I0227 17:36:25.907198 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:25 crc kubenswrapper[4752]: I0227 17:36:25.907211 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:25 crc kubenswrapper[4752]: I0227 17:36:25.907968 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3"
Feb 27 17:36:25 crc kubenswrapper[4752]: E0227 17:36:25.908198 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
[... 20 identical "Error getting the current node from lister" entries omitted, 17:36:25.952257 through 17:36:27.865371 ...]
Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.905813 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.912107 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.912180 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.912196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.912217 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.912233 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:27Z","lastTransitionTime":"2026-02-27T17:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.928716 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.934704 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.934777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.934796 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.934831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.934852 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:27Z","lastTransitionTime":"2026-02-27T17:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.951983 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.957621 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.957683 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.957705 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.957733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.957753 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:27Z","lastTransitionTime":"2026-02-27T17:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.970949 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.976052 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.976173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.976196 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.976223 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:27 crc kubenswrapper[4752]: I0227 17:36:27.976241 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:27Z","lastTransitionTime":"2026-02-27T17:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.994510 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.994736 4752 
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:36:27 crc kubenswrapper[4752]: E0227 17:36:27.994787 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.095778 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.195984 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.297126 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.398355 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.498695 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.599942 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.700564 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.800798 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:28 crc kubenswrapper[4752]: E0227 17:36:28.901292 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.001791 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.102246 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.203288 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.303497 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.404551 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.505459 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.606628 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.707788 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.808554 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:29 crc kubenswrapper[4752]: E0227 17:36:29.908964 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 
17:36:30.010059 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.110808 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.210991 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.311124 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.411267 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.512378 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.613504 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.713769 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.814724 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:30 crc kubenswrapper[4752]: E0227 17:36:30.915658 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.002352 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.016623 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.117204 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.218249 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.318807 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.419680 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.519786 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.620544 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.720927 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.821346 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:31 crc kubenswrapper[4752]: E0227 17:36:31.922334 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.022661 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.123726 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.224717 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.325553 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.425699 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.526268 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.626568 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.726768 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.827408 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:32 crc kubenswrapper[4752]: E0227 17:36:32.928182 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.028762 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.129229 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.229408 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.329512 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.430317 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: I0227 17:36:33.506561 4752 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.530880 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.631880 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.732012 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.832612 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:33 crc kubenswrapper[4752]: E0227 17:36:33.933528 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.034574 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.135707 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.236354 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.337450 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.437814 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.538732 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.639229 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.739369 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.840483 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:34 crc kubenswrapper[4752]: E0227 17:36:34.941212 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.041602 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.141723 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.242277 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.343049 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.444186 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.544420 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.644565 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.745220 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.846311 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:35 crc kubenswrapper[4752]: E0227 17:36:35.946856 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.046978 4752 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.147625 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.248509 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.348828 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.449046 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.549941 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.650729 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.751783 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.852339 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:36 crc kubenswrapper[4752]: E0227 17:36:36.952946 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.053730 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.155319 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.256429 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.356525 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.457677 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.558411 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.658522 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.758653 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.858832 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:37 crc kubenswrapper[4752]: I0227 17:36:37.905803 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:37 crc kubenswrapper[4752]: I0227 17:36:37.907325 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:37 crc kubenswrapper[4752]: I0227 17:36:37.907378 4752 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:37 crc kubenswrapper[4752]: I0227 17:36:37.907391 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:37 crc kubenswrapper[4752]: I0227 17:36:37.908027 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.908438 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:36:37 crc kubenswrapper[4752]: E0227 17:36:37.960045 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.060798 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.162064 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.262440 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.363189 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.372367 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.376424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.376485 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.376503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.376531 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.376548 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:38Z","lastTransitionTime":"2026-02-27T17:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.392555 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.397416 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.397491 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.397516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.397547 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.397569 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:38Z","lastTransitionTime":"2026-02-27T17:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.414580 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 17:36:38.392555 entry above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.419814 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.419874 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.419892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.419915 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.419935 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:38Z","lastTransitionTime":"2026-02-27T17:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.434662 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 17:36:38.392555 entry above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.441463 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.441515 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.441536 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.441563 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:38 crc kubenswrapper[4752]: I0227 17:36:38.441581 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:38Z","lastTransitionTime":"2026-02-27T17:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.457076 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the 17:36:38.392555 entry above; log excerpt ends mid-payload]
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.457365 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.464053 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.565085 4752 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.665381 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.766040 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.866921 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:38 crc kubenswrapper[4752]: E0227 17:36:38.967841 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.068337 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.169230 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.270118 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.370694 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.471045 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.571703 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.672818 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.773328 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.873585 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:39 crc kubenswrapper[4752]: E0227 17:36:39.974580 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.075622 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.176744 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.277487 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.378592 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.479565 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.579978 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.680491 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.780990 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.881683 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:40 crc kubenswrapper[4752]: E0227 17:36:40.982422 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.002493 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.083429 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.183559 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.284567 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.385216 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.486336 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.587966 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.688591 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.789430 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.890315 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:41 crc kubenswrapper[4752]: E0227 17:36:41.991244 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.091816 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.192210 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.292403 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.392526 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.492624 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.593535 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.693818 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.794625 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.895577 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:42 crc kubenswrapper[4752]: I0227 17:36:42.905991 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 27 17:36:42 crc kubenswrapper[4752]: I0227 17:36:42.907308 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:36:42 crc kubenswrapper[4752]: I0227 17:36:42.907336 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:36:42 crc kubenswrapper[4752]: I0227 17:36:42.907345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:42 crc kubenswrapper[4752]: E0227 17:36:42.995964 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.097055 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.197416 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.297546 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.398304 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.499265 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.600111 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.700671 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.801698 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:43 crc kubenswrapper[4752]: E0227 17:36:43.902500 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.003028 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.104024 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.205229 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.306350 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.406522 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.507564 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.608665 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.709180 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.809416 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:44 crc kubenswrapper[4752]: E0227 17:36:44.910442 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.010616 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.110736 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.211826 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.311996 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.412638 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.513732 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.613862 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.714553 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.814733 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:45 crc kubenswrapper[4752]: E0227 17:36:45.915126 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.015564 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.115897 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.216282 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.317198 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.418289 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.519468 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.619984 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.720446 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.820650 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:46 crc kubenswrapper[4752]: E0227 17:36:46.921476 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.022460 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.123603 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.224120 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.324648 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.425624 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.526615 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.627269 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.728452 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.829630 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:47 crc kubenswrapper[4752]: E0227 17:36:47.930330 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.030943 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.131232 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.232214 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.332328 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.433038 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.533997 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
node from lister" err="node \"crc\" not found" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.635035 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.716823 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.721423 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.721471 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.721482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.721498 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.721509 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:48Z","lastTransitionTime":"2026-02-27T17:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.734716 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.739847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.739933 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.739960 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.739996 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.740035 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:48Z","lastTransitionTime":"2026-02-27T17:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.755827 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.759850 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.759900 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.759912 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.759928 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.759938 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:48Z","lastTransitionTime":"2026-02-27T17:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.770079 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.773835 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.774025 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.774121 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.774247 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:48 crc kubenswrapper[4752]: I0227 17:36:48.774346 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:48Z","lastTransitionTime":"2026-02-27T17:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.788737 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.788958 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.789003 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.889740 4752 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:48 crc kubenswrapper[4752]: E0227 17:36:48.990695 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.091620 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.192381 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.293285 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.394365 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.495263 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.596073 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.697237 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.798387 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.899275 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:49 crc kubenswrapper[4752]: E0227 17:36:49.999876 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.100032 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.201234 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.302211 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.403273 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.504242 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.605162 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.705776 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.806684 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:50 crc kubenswrapper[4752]: E0227 17:36:50.907398 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 
17:36:51.002633 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.007547 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.108802 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.209634 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.310016 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.411053 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.511855 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.612478 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.713425 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.814340 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:51 crc kubenswrapper[4752]: E0227 17:36:51.914826 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.016064 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.117221 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.218621 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.319841 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.420675 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.521562 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.622360 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.722476 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.822841 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:52 crc kubenswrapper[4752]: I0227 17:36:52.906691 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 
17:36:52 crc kubenswrapper[4752]: I0227 17:36:52.908242 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:52 crc kubenswrapper[4752]: I0227 17:36:52.908307 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:52 crc kubenswrapper[4752]: I0227 17:36:52.908332 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:52 crc kubenswrapper[4752]: I0227 17:36:52.909275 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3" Feb 27 17:36:52 crc kubenswrapper[4752]: E0227 17:36:52.925998 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.026744 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.127557 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.227728 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.328595 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.369807 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.371857 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0"} Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.372225 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.373729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.373767 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:53 crc kubenswrapper[4752]: I0227 17:36:53.373776 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.429576 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.530347 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.630944 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.731986 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.833082 4752 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:53 crc kubenswrapper[4752]: E0227 17:36:53.934073 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.034994 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.136064 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.186781 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.236722 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.337462 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.381394 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.382759 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.385579 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" exitCode=255 Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.385644 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0"} Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.385703 4752 scope.go:117] "RemoveContainer" containerID="5f5d9e255d12f08fb6c3de6ea1820963024e5ba5115152042c583a3fbf28b4e3" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.385743 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.387199 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.387248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.387267 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.388203 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.388535 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.438563 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.539507 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: I0227 17:36:54.610095 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.640693 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.741614 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.842757 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:54 crc kubenswrapper[4752]: E0227 17:36:54.943235 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.043650 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.143876 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.245085 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.346286 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.391576 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.394367 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.395503 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.395543 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.395563 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.396617 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.396914 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.446764 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.547669 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.648220 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.749067 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.849492 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.906276 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.908427 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.908497 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:55 crc kubenswrapper[4752]: I0227 17:36:55.908521 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:55 crc kubenswrapper[4752]: E0227 17:36:55.950033 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.050430 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.151276 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.252478 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.352858 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: I0227 17:36:56.397341 4752 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 27 17:36:56 crc kubenswrapper[4752]: I0227 17:36:56.398846 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:56 crc kubenswrapper[4752]: I0227 17:36:56.398898 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:56 crc kubenswrapper[4752]: I0227 17:36:56.398914 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:56 crc kubenswrapper[4752]: I0227 17:36:56.399911 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.400320 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.453911 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.556756 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.657865 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.758918 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.859230 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:56 crc kubenswrapper[4752]: E0227 17:36:56.961275 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.062423 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.163286 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.264100 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.365230 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.465622 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.566745 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.667454 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.768092 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.869050 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:57 crc kubenswrapper[4752]: E0227 17:36:57.969754 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.070696 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.171280 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.272306 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.373472 4752 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.474210 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.575004 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.676097 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.776793 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.801059 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.806319 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.806358 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.806370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.806387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.806400 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:58Z","lastTransitionTime":"2026-02-27T17:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.822342 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.828126 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.828175 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.828187 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.828201 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.828213 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:58Z","lastTransitionTime":"2026-02-27T17:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.842887 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.851694 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.851857 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.851979 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.852105 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.852224 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:58Z","lastTransitionTime":"2026-02-27T17:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.865193 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.869762 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.870322 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.870418 4752 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.870505 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:36:58 crc kubenswrapper[4752]: I0227 17:36:58.870595 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:36:58Z","lastTransitionTime":"2026-02-27T17:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.884822 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.885338 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.885467 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:58 crc kubenswrapper[4752]: E0227 17:36:58.985885 4752 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.087364 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.188648 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.289503 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.390435 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.491482 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.592487 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.692838 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.793354 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.893789 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:36:59 crc kubenswrapper[4752]: E0227 17:36:59.993936 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.094263 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.194419 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.295216 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.396376 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.497135 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.597977 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.698840 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.799947 4752 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 27 17:37:00 crc kubenswrapper[4752]: E0227 17:37:00.900605 4752 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 27 17:37:01 crc kubenswrapper[4752]: E0227 17:37:01.002781 4752 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 27 17:37:01 crc kubenswrapper[4752]: E0227 
17:37:01.012400 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.273021 4752 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.924672 4752 apiserver.go:52] "Watching apiserver" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.935837 4752 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.936326 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jlxqp","openshift-multus/multus-qpbx6","openshift-multus/network-metrics-daemon-jkjwj","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7","openshift-ovn-kubernetes/ovnkube-node-sfztq","openshift-machine-config-operator/machine-config-daemon-cm8wb","openshift-multus/multus-additional-cni-plugins-ntzss","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-image-registry/node-ca-5vlk6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.936768 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.936795 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.936987 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.937118 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 17:37:05 crc kubenswrapper[4752]: E0227 17:37:05.937189 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.937348 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.937378 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:05 crc kubenswrapper[4752]: E0227 17:37:05.937651 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:05 crc kubenswrapper[4752]: E0227 17:37:05.937824 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.938026 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.938214 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.938377 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.938488 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.938869 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:05 crc kubenswrapper[4752]: E0227 17:37:05.939490 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.939513 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.939590 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qpbx6" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.939957 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.940121 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.941927 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.945767 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.946805 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.947235 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.949483 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.949655 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.949731 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.949937 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.950287 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.950299 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.950381 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.950761 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.950771 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951034 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951328 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951376 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951500 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951615 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951721 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.951747 4752 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.952101 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.952356 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.952454 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953016 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953018 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953078 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953196 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953268 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953291 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953328 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953362 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953301 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953422 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953623 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.953767 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.954027 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.979774 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:05 crc kubenswrapper[4752]: I0227 17:37:05.996196 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.006976 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.013328 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.022538 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.033405 4752 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.034920 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.046508 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.055873 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.065592 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.073784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.083170 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.096468 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.096592 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.096620 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097288 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097336 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097363 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097400 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097472 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097506 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097538 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097572 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097606 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097573 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097638 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097671 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097704 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097736 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097769 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097801 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097831 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097869 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 
17:37:06.097906 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097939 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097976 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098008 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098038 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098069 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098100 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098133 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098212 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 17:37:06 crc 
kubenswrapper[4752]: I0227 17:37:06.098281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098313 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098344 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098379 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098411 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098443 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098482 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098513 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098546 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098578 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098611 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098646 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098678 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098715 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098760 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098799 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098840 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098878 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098915 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098983 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099018 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099051 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099095 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099135 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099192 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099226 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099261 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099305 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099368 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099290 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099440 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099475 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099511 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099545 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099614 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099654 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099807 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099850 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097694 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.097873 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098091 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098159 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100019 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098779 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098856 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098869 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.098982 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099383 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099666 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099784 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100047 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100177 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100310 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100804 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100857 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.100894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.101046 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.101310 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.101698 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.101721 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102015 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102030 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102081 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102255 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102314 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102453 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.099896 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102753 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102782 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102810 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102835 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102861 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102880 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102898 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102916 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102936 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102974 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.102991 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103008 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103027 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103061 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103096 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103113 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103177 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103196 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103213 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103230 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103249 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103265 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103282 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103625 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103340 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103498 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103592 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103304 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103743 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103771 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103789 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103815 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103832 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103851 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103869 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103886 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103905 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103939 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103955 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103957 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103972 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103992 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104011 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104030 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104051 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") 
" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104070 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104086 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104104 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104120 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104158 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104177 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104200 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104219 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104235 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104256 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104276 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104301 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.103982 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104325 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104345 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104364 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104387 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104425 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104441 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104457 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104473 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104490 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104509 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104534 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104558 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104575 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104593 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104612 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104627 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104658 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104676 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104693 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104708 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104728 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104746 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104766 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104782 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 
17:37:06.104799 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104819 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104836 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104853 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104871 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104942 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104962 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104980 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104997 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105014 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105030 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105049 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105066 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105082 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105101 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105119 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105136 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105168 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105185 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105206 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105227 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105244 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105262 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105306 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105324 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105340 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105357 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105374 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105392 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105411 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105426 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105457 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105474 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105491 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105509 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105527 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105543 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105563 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105580 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105596 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105613 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105674 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhb87\" (UniqueName: \"kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105698 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53ce186c-640f-4ade-94e1-587c1440fe87-proxy-tls\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-cnibin\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105740 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105761 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105780 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105798 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105816 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-k8s-cni-cncf-io\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105890 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105908 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105924 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105941 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105957 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccstl\" (UniqueName: \"kubernetes.io/projected/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-kube-api-access-ccstl\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105976 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105993 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106012 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dqm\" (UniqueName: \"kubernetes.io/projected/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-kube-api-access-f5dqm\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106034 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106051 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106070 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106092 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106115 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106134 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53ce186c-640f-4ade-94e1-587c1440fe87-rootfs\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106167 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmrv\" (UniqueName: \"kubernetes.io/projected/937bbb35-a3c2-435c-86c5-1072f3a54595-kube-api-access-2jmrv\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-socket-dir-parent\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106217 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106235 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-os-release\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106251 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cni-binary-copy\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106271 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104110 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.107704 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104287 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104403 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104405 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104543 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104643 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.107886 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104865 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104877 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105126 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105343 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108006 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105461 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105540 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105671 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105764 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.105920 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106235 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.106305 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.606274586 +0000 UTC m=+126.513091467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106333 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106426 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106270 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.106567 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.107194 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.107362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.107599 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.104837 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108013 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108185 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108300 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108347 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7rjc\" (UniqueName: \"kubernetes.io/projected/3e5e2ad1-375b-4340-a583-e32742e736e6-kube-api-access-x7rjc\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108369 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108387 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-bin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108428 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-system-cni-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108448 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhxp\" (UniqueName: \"kubernetes.io/projected/1aad5923-f151-43de-a1a0-b8c6906c2d7e-kube-api-access-srhxp\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108466 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-hostroot\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108484 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108522 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108543 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108599 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg72r\" (UniqueName: \"kubernetes.io/projected/53ce186c-640f-4ade-94e1-587c1440fe87-kube-api-access-zg72r\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108623 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cnibin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108635 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108699 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108844 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.109088 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.109359 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.109685 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.109954 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.110097 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.110575 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.110636 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.110655 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111027 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111103 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111217 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111777 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.111934 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.108638 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-conf-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112376 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-daemon-config\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112430 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112491 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112903 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-multus-certs\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112950 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112983 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.112961 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113062 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-os-release\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113109 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113247 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113119 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113636 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113674 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113697 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113762 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aad5923-f151-43de-a1a0-b8c6906c2d7e-hosts-file\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113836 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-netns\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113884 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-kubelet\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-host\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113559 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114249 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114437 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114626 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114718 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114761 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114856 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.114890 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.114956 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115031 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115113 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115180 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115280 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7"
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115360 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID:
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113789 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113338 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113326 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113418 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113387 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.113534 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.113638 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115622 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.115778 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.615752774 +0000 UTC m=+126.522569655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115444 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ce186c-640f-4ade-94e1-587c1440fe87-mcd-auth-proxy-config\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.115993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116047 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-system-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-multus\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116430 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-etc-kubernetes\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116452 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-serviceca\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116497 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5v8\" (UniqueName: \"kubernetes.io/projected/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-kube-api-access-fj5v8\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116476 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.116645 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.616626626 +0000 UTC m=+126.523443477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116892 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116896 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.116965 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117000 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117075 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117428 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117449 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117640 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.117456 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118077 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118103 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118209 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118231 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118246 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118256 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118265 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118274 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118283 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118293 4752 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118303 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118313 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118313 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118324 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118374 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118404 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118427 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118453 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118473 4752 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118548 4752 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118557 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118569 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118589 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118608 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118628 4752 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118649 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118669 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118690 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118710 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118729 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118736 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118749 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118750 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118769 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118789 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118808 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118827 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118846 4752 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118866 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118886 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118908 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118927 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118945 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118966 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.118985 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119006 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119049 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119022 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119066 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119118 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119137 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119171 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119186 4752 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119201 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119219 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119232 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119245 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119259 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119254 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119280 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119703 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119744 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119768 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119792 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119827 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119866 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119891 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119875 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119939 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119952 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119964 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119977 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.119990 4752 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120001 4752 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120016 4752 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120028 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120041 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120052 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120062 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120073 4752 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120127 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120169 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120180 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120190 4752 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120200 4752 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120210 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120220 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120232 4752 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120243 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120254 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120265 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120277 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120287 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120297 4752 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120307 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120021 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120233 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120343 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120524 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.120596 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121052 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121234 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121014 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121458 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121622 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.121323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122323 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122689 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122752 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122466 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.122907 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123275 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123470 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123506 4752 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123419 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123641 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123673 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123723 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.123753 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124277 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124356 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124718 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124578 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124935 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.124934 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.125227 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.125846 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.126285 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.127950 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.127985 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.128005 4752 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.128079 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.628060143 +0000 UTC m=+126.534877034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.129394 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.131098 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.134235 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.134561 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.134809 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.134805 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.134887 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.135207 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.135658 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.135958 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.136056 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.136569 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.136988 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.137305 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.137977 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.138481 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.138806 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.138820 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.138864 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.638849824 +0000 UTC m=+126.545666685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.138571 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.139519 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.141229 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.143894 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.144667 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.144777 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.144980 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.146491 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.146528 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.146554 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.146751 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.146978 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147096 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147406 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147670 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147724 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147741 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.147919 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.150320 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.150369 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.150715 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.150753 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.151819 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.152096 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.152291 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.158944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.161315 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.163447 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.171220 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.172361 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.177574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.182472 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.190429 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221304 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-os-release\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221535 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aad5923-f151-43de-a1a0-b8c6906c2d7e-hosts-file\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221464 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-os-release\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221656 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221776 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221852 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221883 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aad5923-f151-43de-a1a0-b8c6906c2d7e-hosts-file\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221929 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.221722 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222003 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-netns\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-netns\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222129 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-kubelet\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222199 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-kubelet\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222270 4752 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-host\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222353 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222389 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222431 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222462 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222531 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222561 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-multus\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222591 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-etc-kubernetes\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222653 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ce186c-640f-4ade-94e1-587c1440fe87-mcd-auth-proxy-config\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222690 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-system-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-serviceca\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222744 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222778 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222748 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5v8\" (UniqueName: \"kubernetes.io/projected/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-kube-api-access-fj5v8\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222834 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-multus\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222840 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhb87\" (UniqueName: \"kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222863 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222873 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/53ce186c-640f-4ade-94e1-587c1440fe87-proxy-tls\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222896 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-cnibin\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222934 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222994 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223023 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223045 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-k8s-cni-cncf-io\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223102 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223174 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-etc-kubernetes\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223180 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223231 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-cnibin\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223244 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccstl\" (UniqueName: \"kubernetes.io/projected/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-kube-api-access-ccstl\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223298 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.222923 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223321 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223383 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-system-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5dqm\" (UniqueName: \"kubernetes.io/projected/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-kube-api-access-f5dqm\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223432 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223453 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53ce186c-640f-4ade-94e1-587c1440fe87-rootfs\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223475 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223493 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223510 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223531 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmrv\" (UniqueName: \"kubernetes.io/projected/937bbb35-a3c2-435c-86c5-1072f3a54595-kube-api-access-2jmrv\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-socket-dir-parent\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223576 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-os-release\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223592 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cni-binary-copy\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223629 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223645 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7rjc\" (UniqueName: \"kubernetes.io/projected/3e5e2ad1-375b-4340-a583-e32742e736e6-kube-api-access-x7rjc\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223664 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223680 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-bin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223797 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223925 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223954 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-k8s-cni-cncf-io\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224026 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224063 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224087 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224174 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224220 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-system-cni-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224248 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhxp\" (UniqueName: \"kubernetes.io/projected/1aad5923-f151-43de-a1a0-b8c6906c2d7e-kube-api-access-srhxp\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224263 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-hostroot\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224292 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224308 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-daemon-config\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " 
pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224309 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-multus-certs\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224350 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-run-multus-certs\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224358 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224380 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224399 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-os-release\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224402 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-socket-dir-parent\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224398 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg72r\" (UniqueName: \"kubernetes.io/projected/53ce186c-640f-4ade-94e1-587c1440fe87-kube-api-access-zg72r\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223209 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc 
kubenswrapper[4752]: I0227 17:37:06.224427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cnibin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224432 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224441 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-conf-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.223420 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-host\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224540 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224564 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224568 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-cni-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224589 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224614 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-hostroot\") pod 
\"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224709 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224720 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-conf-dir\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224861 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cnibin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224902 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224935 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-host-var-lib-cni-bin\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224956 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53ce186c-640f-4ade-94e1-587c1440fe87-mcd-auth-proxy-config\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224992 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/53ce186c-640f-4ade-94e1-587c1440fe87-rootfs\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.224969 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e5e2ad1-375b-4340-a583-e32742e736e6-system-cni-dir\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.225066 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.225131 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs 
podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:06.725108599 +0000 UTC m=+126.631925490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225187 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-multus-daemon-config\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225222 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225403 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225413 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-cni-binary-copy\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225434 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225402 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-serviceca\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225530 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225556 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225571 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225584 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225597 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225610 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225649 4752 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225661 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225670 4752 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225679 4752 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225688 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225706 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225715 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225724 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225733 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: 
I0227 17:37:06.225742 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225751 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225759 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225768 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225776 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225785 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225796 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225805 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225814 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225822 4752 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225829 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225838 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225850 4752 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" 
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225860 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225871 4752 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225882 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225893 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225905 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225916 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225928 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225936 4752 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225945 4752 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225953 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225961 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225970 4752 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225978 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225986 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.225994 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226001 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226009 4752 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226017 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226043 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226056 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226067 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226082 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226090 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226098 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226105 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226126 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226136 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226163 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226174 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226185 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226196 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226204 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226213 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226223 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226231 4752 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226240 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226248 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226256 4752 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226265 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226272 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226280 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226289 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226296 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226304 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226312 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226320 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226328 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226336 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226351 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226363 4752 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226371 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226379 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226387 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226398 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226417 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226426 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226433 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226441 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226449 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226457 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226468 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226477 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226487 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226495 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" 
(UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226503 4752 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226511 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226518 4752 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226526 4752 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226534 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226541 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226549 4752 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226556 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226566 4752 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.226574 4752 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.227107 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e5e2ad1-375b-4340-a583-e32742e736e6-cni-binary-copy\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.235114 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53ce186c-640f-4ade-94e1-587c1440fe87-proxy-tls\") pod \"machine-config-daemon-cm8wb\" (UID: 
\"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.237926 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.239516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.242543 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7rjc\" (UniqueName: \"kubernetes.io/projected/3e5e2ad1-375b-4340-a583-e32742e736e6-kube-api-access-x7rjc\") pod \"multus-additional-cni-plugins-ntzss\" (UID: \"3e5e2ad1-375b-4340-a583-e32742e736e6\") " pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.243228 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5v8\" (UniqueName: \"kubernetes.io/projected/28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd-kube-api-access-fj5v8\") pod \"ovnkube-control-plane-749d76644c-jpsg7\" (UID: \"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.243779 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg72r\" (UniqueName: \"kubernetes.io/projected/53ce186c-640f-4ade-94e1-587c1440fe87-kube-api-access-zg72r\") pod \"machine-config-daemon-cm8wb\" (UID: \"53ce186c-640f-4ade-94e1-587c1440fe87\") " pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.244933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhb87\" (UniqueName: \"kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87\") pod \"ovnkube-node-sfztq\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.249026 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhxp\" (UniqueName: \"kubernetes.io/projected/1aad5923-f151-43de-a1a0-b8c6906c2d7e-kube-api-access-srhxp\") pod \"node-resolver-jlxqp\" (UID: \"1aad5923-f151-43de-a1a0-b8c6906c2d7e\") " pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.250933 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmrv\" (UniqueName: \"kubernetes.io/projected/937bbb35-a3c2-435c-86c5-1072f3a54595-kube-api-access-2jmrv\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.252096 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dqm\" (UniqueName: 
\"kubernetes.io/projected/098f70a1-c2c2-44ce-9c0c-356e7eea2da9-kube-api-access-f5dqm\") pod \"multus-qpbx6\" (UID: \"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\") " pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.252362 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccstl\" (UniqueName: \"kubernetes.io/projected/80ecbe44-7a3a-4cf1-9be4-b2f304a4fade-kube-api-access-ccstl\") pod \"node-ca-5vlk6\" (UID: \"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\") " pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.256066 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.269482 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.279763 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.293310 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.303504 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jlxqp" Feb 27 17:37:06 crc kubenswrapper[4752]: W0227 17:37:06.303493 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a7fe2d45694572d6a01cdd9b6f1ab3a0c80b3074822779f47ad5eb43f75d6ba9 WatchSource:0}: Error finding container a7fe2d45694572d6a01cdd9b6f1ab3a0c80b3074822779f47ad5eb43f75d6ba9: Status 404 returned error can't find the container with id a7fe2d45694572d6a01cdd9b6f1ab3a0c80b3074822779f47ad5eb43f75d6ba9 Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.312288 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ntzss" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.322497 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.335805 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5vlk6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.345400 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qpbx6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.354419 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" Feb 27 17:37:06 crc kubenswrapper[4752]: W0227 17:37:06.384062 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ce186c_640f_4ade_94e1_587c1440fe87.slice/crio-ac0c691bd7be48409cdec4852bdd2af1070489fbae260373a244f38ef9e52da3 WatchSource:0}: Error finding container ac0c691bd7be48409cdec4852bdd2af1070489fbae260373a244f38ef9e52da3: Status 404 returned error can't find the container with id ac0c691bd7be48409cdec4852bdd2af1070489fbae260373a244f38ef9e52da3 Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.425411 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"90f139d1a00ad4a16e4501b5ab1653634bf1e35e8f3070d69bd8d242bee3d228"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.426877 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"575bb0d04d6367e1531a16d9b50ca33c95412aeaad58ef52ad70a7bc5521000c"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.428595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" event={"ID":"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd","Type":"ContainerStarted","Data":"4dd0d81926216f54815fb577784cf10f5a49f40d637c8790332d62b6d102747e"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.429552 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5vlk6" event={"ID":"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade","Type":"ContainerStarted","Data":"da6fc3596e8faa759371f1ab49a222b8a3d0edab16f009017e46d7f18d0a9636"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.430521 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerStarted","Data":"55318355bf879627f8ae15709f39c802da915a82eef46fa9269ac653f7e8e8e9"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.432034 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jlxqp" event={"ID":"1aad5923-f151-43de-a1a0-b8c6906c2d7e","Type":"ContainerStarted","Data":"e7c25049108084150e55f879a173d4ddadd6bfb4e85d7dbdb8783a7dff660c67"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.435902 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerStarted","Data":"bf1ce73603ddf12341f54f664797afa9fddcf02e14173448b4ec1a690ff06c56"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.438508 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f4a45d4cb97a37184704dc7807c57940b14d933a70a1f7e2ec292da30bf94e60"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.439623 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"ac0c691bd7be48409cdec4852bdd2af1070489fbae260373a244f38ef9e52da3"} 
Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.440998 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a7fe2d45694572d6a01cdd9b6f1ab3a0c80b3074822779f47ad5eb43f75d6ba9"} Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.631065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.631231 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631277 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.631250244 +0000 UTC m=+127.538067105 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631356 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631412 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.631399858 +0000 UTC m=+127.538216709 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.631427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.631511 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631584 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631598 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631608 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631613 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631636 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.631626554 +0000 UTC m=+127.538443405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.631656 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.631646814 +0000 UTC m=+127.538463675 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.732810 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.732887 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733017 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733050 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733093 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733108 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733073 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.73305647 +0000 UTC m=+127.639873331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:06 crc kubenswrapper[4752]: E0227 17:37:06.733209 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:07.733188363 +0000 UTC m=+127.640005294 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.913124 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.913810 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.914897 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.915858 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.916807 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.917533 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.918424 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.919287 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.920352 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.921119 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.921875 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.925844 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.927609 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.929912 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.931104 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.933057 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.934396 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.935364 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.937445 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.938756 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.939889 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.942213 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.943209 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.945513 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.946686 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.948327 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.950299 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.951778 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.953434 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.954674 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.956949 4752 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.957223 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.961569 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.963085 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.964369 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.968303 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.971595 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.973076 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.975487 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.976924 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.978876 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.980284 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.982604 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.983604 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.984876 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.985663 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.987254 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.988617 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.989938 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.990753 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.991462 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.992782 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.993620 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 27 17:37:06 crc kubenswrapper[4752]: I0227 17:37:06.994847 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.446590 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" 
event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerStarted","Data":"ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.449595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jlxqp" event={"ID":"1aad5923-f151-43de-a1a0-b8c6906c2d7e","Type":"ContainerStarted","Data":"49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.453017 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec" exitCode=0 Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.453209 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.457218 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5vlk6" event={"ID":"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade","Type":"ContainerStarted","Data":"53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.461393 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.461757 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.478068 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.480252 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644" exitCode=0 Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.480335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.486026 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" event={"ID":"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd","Type":"ContainerStarted","Data":"a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.486074 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" 
event={"ID":"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd","Type":"ContainerStarted","Data":"59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.492392 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.492436 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f"} Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.495743 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.529455 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.549711 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.580069 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: 
I0227 17:37:07.594427 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.608938 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.633958 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.647726 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.647798 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.647846 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.647872 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.647851924 +0000 UTC m=+129.554668775 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.647899 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.647918 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.647950 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.647942947 +0000 UTC m=+129.554759798 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.647989 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.648020 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.647994 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.648069 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.64805875 +0000 UTC m=+129.554875601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.648033 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.648185 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.648139552 +0000 UTC m=+129.554956413 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.650316 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.664563 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.678001 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.697302 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.714402 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.726295 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.737107 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.748443 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.748517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748604 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748624 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748625 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748637 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748689 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.748673405 +0000 UTC m=+129.655490266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.748708 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:09.748702276 +0000 UTC m=+129.655519127 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.757307 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z 
is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.766021 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.776478 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.790015 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.803703 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.817964 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.829939 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.844840 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc 
kubenswrapper[4752]: I0227 17:37:07.858025 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.867450 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.876409 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.888167 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.899257 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.906002 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.906016 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.906004 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.906089 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.906002 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.906237 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.906313 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:07 crc kubenswrapper[4752]: E0227 17:37:07.906362 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:07 crc kubenswrapper[4752]: I0227 17:37:07.908271 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:07Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.499309 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.499673 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.499684 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.499694 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.499703 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.502227 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" 
event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerStarted","Data":"7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb"} Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.523622 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.542093 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.553899 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.568229 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.580071 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.590455 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.601878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.616603 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.629951 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.644757 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.662520 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.677181 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.692850 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:08 crc kubenswrapper[4752]: I0227 17:37:08.707951 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:08Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.191048 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.191395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.191404 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.191421 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.191431 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:09Z","lastTransitionTime":"2026-02-27T17:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.202428 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 
2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.205892 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.205931 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.205941 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.205960 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.206228 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:09Z","lastTransitionTime":"2026-02-27T17:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.218511 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.222318 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.222359 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.222370 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.222387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.222402 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:09Z","lastTransitionTime":"2026-02-27T17:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.237639 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.241878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.242083 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.242291 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.242587 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.242801 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:09Z","lastTransitionTime":"2026-02-27T17:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.256218 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.260663 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.260856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.260985 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.261119 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.261281 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:09Z","lastTransitionTime":"2026-02-27T17:37:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.275074 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after
2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.275839 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.510448 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.514278 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb" exitCode=0 Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.514938 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb"} Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.517972 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a"} Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.530992 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.548435 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.562713 4752 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.577983 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.596620 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.610063 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.631032 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.657489 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z 
is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.667642 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.667708 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.667748 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.667784 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668310 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668404 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.668378045 +0000 UTC m=+133.575194896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668484 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668686 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.668610131 +0000 UTC m=+133.575427012 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668761 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.668744185 +0000 UTC m=+133.575561026 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668912 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668957 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.668969 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.669021 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.669006761 +0000 UTC m=+133.575823612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.669108 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.681612 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.695500 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.707274 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.722338 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.746009 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.764208 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.768654 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.768720 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " 
pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.768826 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.768867 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.768899 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.768937 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.768877 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.768862578 +0000 UTC m=+133.675679429 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.769014 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:13.768996461 +0000 UTC m=+133.675813302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.777709 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.790062 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.805807 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.826212 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.840836 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.871199 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.883013 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.899949 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.906501 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.906562 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.906697 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.906677 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.906839 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.906869 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.906913 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.906967 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.916063 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.924273 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.925491 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:09 crc kubenswrapper[4752]: E0227 17:37:09.926031 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.936368 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.952231 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.970065 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:09 crc kubenswrapper[4752]: I0227 17:37:09.993565 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:09Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.524549 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6" exitCode=0 Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.524632 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6"} Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.525827 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:10 crc kubenswrapper[4752]: E0227 17:37:10.526107 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.543980 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.558424 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.577567 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.597730 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.612919 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.631213 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.653005 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z 
is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.665923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.679557 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.696440 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.709973 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.722256 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.735340 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.750245 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.762929 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.925550 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.938706 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.946761 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.961906 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.974731 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.986108 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:10 crc kubenswrapper[4752]: I0227 17:37:10.999182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: E0227 17:37:11.013880 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.021469 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\
\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.035092 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.049764 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.065705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.082070 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.096506 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.110973 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.130352 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.542262 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.546878 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2" exitCode=0 Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.546937 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2"} Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.564408 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.586393 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.614558 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.634386 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.650392 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.667607 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.682310 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.710337 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.727695 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.747598 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.763750 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.778140 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.788747 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.806018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.820959 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.906416 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.906460 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.906472 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:11 crc kubenswrapper[4752]: E0227 17:37:11.906605 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:11 crc kubenswrapper[4752]: I0227 17:37:11.906429 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:11 crc kubenswrapper[4752]: E0227 17:37:11.906786 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:11 crc kubenswrapper[4752]: E0227 17:37:11.906852 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:11 crc kubenswrapper[4752]: E0227 17:37:11.906897 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.554126 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297" exitCode=0 Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.554192 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297"} Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.570734 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.587018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.600041 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.621530 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.642905 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.654074 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.665170 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.679515 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.696000 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.709850 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.724208 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.742760 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57d
e17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.763578 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.776784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:12 crc kubenswrapper[4752]: I0227 17:37:12.787247 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:12Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.573759 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057"} Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.576001 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.576054 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.576073 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.580974 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="3e5e2ad1-375b-4340-a583-e32742e736e6" containerID="590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82" exitCode=0 Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.581055 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerDied","Data":"590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82"} Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.603625 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.619814 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.635985 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.637087 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.638281 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.655532 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.668784 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.679884 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.695829 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.711701 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.711868 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.711942 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.712010 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712125 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712235 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.712213655 +0000 UTC m=+141.619030546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712704 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.712683316 +0000 UTC m=+141.619500207 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712808 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712865 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.7128488 +0000 UTC m=+141.619665691 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712963 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.712985 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.713003 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.713045 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.713031875 +0000 UTC m=+141.619848766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.715655 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab9
5a48addc8ce017b8363a6057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.726609 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.738291 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.754306 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.767185 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.779387 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.789285 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.800996 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.812702 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.812763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.812857 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.812921 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.812904931 +0000 UTC m=+141.719721792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.813278 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.813309 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.813329 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.813383 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:21.813371833 +0000 UTC m=+141.720188694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.821792 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.833721 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.845180 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.856694 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.868302 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.886329 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.901234 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.906021 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.906165 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.906323 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.906334 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.906502 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.906417 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.906521 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 17:37:13 crc kubenswrapper[4752]: E0227 17:37:13.906710 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.913137 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.932838 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.948574 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.957348 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.969632 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:13 crc kubenswrapper[4752]: I0227 17:37:13.987758 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:13Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.002998 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.019356 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.592783 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" event={"ID":"3e5e2ad1-375b-4340-a583-e32742e736e6","Type":"ContainerStarted","Data":"252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f"} Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.621377 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987
117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.658529 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.675253 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.690863 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.718827 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.738751 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.754669 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.773086 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.790256 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.816254 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.844132 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.862350 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.876633 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.891512 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:14 crc kubenswrapper[4752]: I0227 17:37:14.905344 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:14Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:15 crc kubenswrapper[4752]: I0227 17:37:15.906008 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:15 crc kubenswrapper[4752]: I0227 17:37:15.906104 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:15 crc kubenswrapper[4752]: E0227 17:37:15.906208 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:15 crc kubenswrapper[4752]: E0227 17:37:15.906333 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:15 crc kubenswrapper[4752]: I0227 17:37:15.906008 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:15 crc kubenswrapper[4752]: E0227 17:37:15.906473 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:15 crc kubenswrapper[4752]: I0227 17:37:15.906607 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:15 crc kubenswrapper[4752]: E0227 17:37:15.906694 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:16 crc kubenswrapper[4752]: E0227 17:37:16.015500 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
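Diagnostic sketch (not part of the captured log): every "Failed to update status for pod" record above reports the same root cause — the kubelet's status PATCH is intercepted by the "pod.network-node-identity.openshift.io" admission webhook at https://127.0.0.1:9743, whose serving certificate expired at 2025-08-24T17:21:41Z while the node clock reads 2026-02-27. One way to confirm the expiry from the node itself, assuming shell access and the openssl CLI (the endpoint and dates are taken directly from the entries above; this is an illustrative check, not something the log or cluster tooling performs):

    # Probe the loopback webhook endpoint and print the served
    # certificate's validity window. s_client emits the PEM cert,
    # which x509 then parses for the notBefore/notAfter dates.
    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null \
      | openssl x509 -noout -dates
    # On this node one would expect: notAfter=Aug 24 17:21:41 2025 GMT,
    # i.e. already expired relative to the 2026-02-27 system time.

The interleaved "no CNI configuration file in /etc/kubernetes/cni/net.d/" errors appear consistent with the same failure: while the identity webhook is unusable the network plugin cannot finish initializing, so no CNI config is written and sandbox creation keeps being skipped.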
Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.604271 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/0.log" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.609462 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057" exitCode=1 Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.609512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057"} Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.611384 4752 scope.go:117] "RemoveContainer" containerID="d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.630853 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.653368 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.670263 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.696440 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.717454 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.744876 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.763097 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.779697 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.801957 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.821104 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.844870 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.875390 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:15Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 17:37:15.931246 6708 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:37:15.931295 6708 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:37:15.931318 6708 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:15.931336 6708 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:15.931361 6708 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:37:15.931378 6708 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:15.931421 6708 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:15.931442 6708 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:15.931440 6708 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:37:15.931460 6708 factory.go:656] Stopping watch factory\\\\nI0227 17:37:15.931484 6708 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:15.931492 6708 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:37:15.931504 6708 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 
17:37:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.888008 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.899427 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:16 crc kubenswrapper[4752]: I0227 17:37:16.913721 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"n
ame\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:16Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.620290 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/1.log" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.620785 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/0.log" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.624985 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696" exitCode=1 Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.625095 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696"} Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.625166 4752 scope.go:117] "RemoveContainer" containerID="d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.626167 4752 scope.go:117] "RemoveContainer" containerID="73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696" Feb 27 17:37:17 crc kubenswrapper[4752]: E0227 17:37:17.626438 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.640263 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.653679 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.667415 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.690356 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:15Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 17:37:15.931246 6708 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:37:15.931295 6708 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:37:15.931318 6708 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:15.931336 6708 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:15.931361 6708 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:37:15.931378 6708 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:15.931421 6708 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:15.931442 6708 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:15.931440 6708 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:37:15.931460 6708 factory.go:656] Stopping watch factory\\\\nI0227 17:37:15.931484 6708 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:15.931492 6708 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:37:15.931504 6708 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.702040 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.724842 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.740075 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.755265 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.772879 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.785131 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.797719 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.809475 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.825338 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.844726 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.858251 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.906755 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.906772 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.906986 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:17 crc kubenswrapper[4752]: E0227 17:37:17.907285 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:37:17 crc kubenswrapper[4752]: I0227 17:37:17.907358 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:37:17 crc kubenswrapper[4752]: E0227 17:37:17.907455 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 17:37:17 crc kubenswrapper[4752]: E0227 17:37:17.907715 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 17:37:17 crc kubenswrapper[4752]: E0227 17:37:17.907957 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
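
The burst of "Error syncing pod" entries above is the kubelet's runtime network readiness gate at work: the container runtime keeps reporting NetworkReady=false until a CNI configuration file appears under /etc/kubernetes/cni/net.d/, and every pod that needs a pod-network sandbox is skipped until then. The Go sketch below only illustrates that directory check under the usual CNI conventions; the *.conf/*.conflist/*.json patterns are assumptions, and the kubelet's real logic lives in the runtime's CNI handling, not in code like this.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet messages above.
	confDir := "/etc/kubernetes/cni/net.d"
	// Assumed config extensions (usual CNI conventions, not read from the log).
	patterns := []string{"*.conf", "*.conflist", "*.json"}
	var found []string
	for _, p := range patterns {
		matches, err := filepath.Glob(filepath.Join(confDir, p))
		if err != nil {
			fmt.Fprintln(os.Stderr, "bad glob pattern:", err)
			os.Exit(1)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// The state this node is in: nothing for the runtime to load yet.
		fmt.Printf("NetworkReady=false: no CNI configuration file in %s/\n", confDir)
		return
	}
	fmt.Println("CNI config present:", found)
}

Run on this node at 17:37 it would report the same empty directory, which is exactly the condition the NodeNotReady events that follow keep citing.
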
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:18 crc kubenswrapper[4752]: I0227 17:37:18.636730 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/1.log" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.524349 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.524396 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.524414 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.524438 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.524457 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:19Z","lastTransitionTime":"2026-02-27T17:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.541539 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:19Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.545983 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.546055 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.546075 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.546101 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.546119 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:19Z","lastTransitionTime":"2026-02-27T17:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.566909 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:19Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.572020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.572113 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.572190 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.572230 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.572255 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:19Z","lastTransitionTime":"2026-02-27T17:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.594050 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:19Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.599426 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.599476 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.599494 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.599523 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.599542 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:19Z","lastTransitionTime":"2026-02-27T17:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.618892 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:19Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.624387 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.624483 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.624499 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.624571 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.624585 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:19Z","lastTransitionTime":"2026-02-27T17:37:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.642825 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:19Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.643018 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.905873 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.905984 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.905862 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.906211 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.906418 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.906548 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:19 crc kubenswrapper[4752]: I0227 17:37:19.908283 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:19 crc kubenswrapper[4752]: E0227 17:37:19.908535 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:20 crc kubenswrapper[4752]: I0227 17:37:20.932641 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:20Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:20 crc kubenswrapper[4752]: I0227 17:37:20.954054 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:20Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:20 crc kubenswrapper[4752]: I0227 17:37:20.969581 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:20Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:20 crc kubenswrapper[4752]: I0227 17:37:20.990762 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:20Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.014486 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.017268 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.036756 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.055444 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.080932 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\
\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:15Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 17:37:15.931246 6708 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:37:15.931295 6708 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:37:15.931318 6708 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:15.931336 6708 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:15.931361 6708 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:37:15.931378 6708 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:15.931421 6708 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:15.931442 6708 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:15.931440 6708 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:37:15.931460 6708 factory.go:656] Stopping watch factory\\\\nI0227 17:37:15.931484 6708 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:15.931492 6708 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:37:15.931504 6708 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.091139 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.102304 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.118235 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.131386 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.144029 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.156367 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.177271 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:21Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.810332 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.810523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.810611 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:37:37.810574687 +0000 UTC m=+157.717391578 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.810690 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.810744 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.810820 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.810884 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.810907 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.810957 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.811013 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-02-27 17:37:37.810981017 +0000 UTC m=+157.717797898 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.811048 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.811054 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:37.811040309 +0000 UTC m=+157.717857190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.811111 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:37.81109392 +0000 UTC m=+157.717910801 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.905886 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.905978 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.906049 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.906631 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.906858 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.907130 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.907351 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.907585 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.911890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:21 crc kubenswrapper[4752]: I0227 17:37:21.912014 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912139 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912223 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912261 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912285 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912304 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:37.912265888 +0000 UTC m=+157.819082909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:21 crc kubenswrapper[4752]: E0227 17:37:21.912357 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:37:37.91233449 +0000 UTC m=+157.819151511 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:22 crc kubenswrapper[4752]: I0227 17:37:22.906899 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:22 crc kubenswrapper[4752]: E0227 17:37:22.907350 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:37:23 crc kubenswrapper[4752]: I0227 17:37:23.905824 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:23 crc kubenswrapper[4752]: I0227 17:37:23.905890 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:23 crc kubenswrapper[4752]: I0227 17:37:23.905894 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:23 crc kubenswrapper[4752]: E0227 17:37:23.906013 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:23 crc kubenswrapper[4752]: I0227 17:37:23.906117 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:23 crc kubenswrapper[4752]: E0227 17:37:23.906342 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:23 crc kubenswrapper[4752]: E0227 17:37:23.906487 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:23 crc kubenswrapper[4752]: E0227 17:37:23.906639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:24 crc kubenswrapper[4752]: I0227 17:37:24.922561 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 27 17:37:25 crc kubenswrapper[4752]: I0227 17:37:25.906623 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:25 crc kubenswrapper[4752]: I0227 17:37:25.906785 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:25 crc kubenswrapper[4752]: I0227 17:37:25.906890 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:25 crc kubenswrapper[4752]: E0227 17:37:25.906957 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:25 crc kubenswrapper[4752]: E0227 17:37:25.907106 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:25 crc kubenswrapper[4752]: E0227 17:37:25.907267 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:25 crc kubenswrapper[4752]: I0227 17:37:25.907294 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:25 crc kubenswrapper[4752]: E0227 17:37:25.907537 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:26 crc kubenswrapper[4752]: E0227 17:37:26.018760 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:27 crc kubenswrapper[4752]: I0227 17:37:27.906015 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:27 crc kubenswrapper[4752]: I0227 17:37:27.906082 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:27 crc kubenswrapper[4752]: I0227 17:37:27.906120 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:27 crc kubenswrapper[4752]: I0227 17:37:27.906030 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:27 crc kubenswrapper[4752]: E0227 17:37:27.906319 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:27 crc kubenswrapper[4752]: E0227 17:37:27.906497 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:27 crc kubenswrapper[4752]: E0227 17:37:27.907015 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:27 crc kubenswrapper[4752]: E0227 17:37:27.907211 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:27 crc kubenswrapper[4752]: I0227 17:37:27.924863 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 27 17:37:29 crc kubenswrapper[4752]: I0227 17:37:29.905931 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:29 crc kubenswrapper[4752]: I0227 17:37:29.905968 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:29 crc kubenswrapper[4752]: E0227 17:37:29.907385 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:29 crc kubenswrapper[4752]: I0227 17:37:29.906122 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:29 crc kubenswrapper[4752]: E0227 17:37:29.907808 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:29 crc kubenswrapper[4752]: E0227 17:37:29.907400 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:29 crc kubenswrapper[4752]: I0227 17:37:29.906011 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:29 crc kubenswrapper[4752]: E0227 17:37:29.908402 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.005084 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.005454 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.005607 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.005751 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.005898 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:30Z","lastTransitionTime":"2026-02-27T17:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.028667 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.034777 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.034856 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.034880 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.034911 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.034934 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:30Z","lastTransitionTime":"2026-02-27T17:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.056777 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.062568 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.062649 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.062675 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.062707 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.062732 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:30Z","lastTransitionTime":"2026-02-27T17:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.083846 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.089992 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.090051 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.090078 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.090112 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.090140 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:30Z","lastTransitionTime":"2026-02-27T17:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.112491 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.118006 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.118069 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.118095 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.118123 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.118178 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:30Z","lastTransitionTime":"2026-02-27T17:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.139219 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: E0227 17:37:30.139441 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.927931 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.942278 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.955497 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"
quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.981628 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d7ef152ac9a97f6fb9d6e3fe0a8f3b241e2ab95a48addc8ce017b8363a6057\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:15Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0227 17:37:15.931246 6708 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:37:15.931295 6708 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:37:15.931318 6708 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:15.931336 6708 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:15.931361 6708 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:37:15.931378 6708 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:37:15.931391 6708 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:15.931421 6708 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:15.931442 6708 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:15.931440 6708 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:37:15.931460 6708 factory.go:656] Stopping watch factory\\\\nI0227 17:37:15.931484 6708 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:15.931492 6708 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:37:15.931504 6708 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: 
*v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:30 crc kubenswrapper[4752]: I0227 17:37:30.998964 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:30Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: E0227 17:37:31.019434 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.020348 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.037226 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.052417 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.073066 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.089816 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.110660 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.126485 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.145447 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.167377 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.187810 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.212030 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.231013 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.906220 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.906277 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.906335 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:31 crc kubenswrapper[4752]: E0227 17:37:31.906408 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.906345 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:31 crc kubenswrapper[4752]: E0227 17:37:31.906951 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:31 crc kubenswrapper[4752]: E0227 17:37:31.907056 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:31 crc kubenswrapper[4752]: E0227 17:37:31.907183 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.907504 4752 scope.go:117] "RemoveContainer" containerID="73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.929254 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.953591 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.978197 
4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:31 crc kubenswrapper[4752]: I0227 17:37:31.996561 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:31Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.013207 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.044569 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.057865 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.073557 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.087641 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.101812 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.114925 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.126059 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.140446 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.152799 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.165810 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.176118 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.186900 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.695289 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/1.log" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.698413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951"} Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.698760 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.710935 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.724710 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.738689 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.752864 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.774077 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.803580 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.820652 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.831348 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.845654 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.859130 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.874710 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.888268 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.902336 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.922718 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.944097 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.958249 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:32 crc kubenswrapper[4752]: I0227 17:37:32.969220 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:32Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.705965 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/2.log" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.707314 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/1.log" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.712066 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" exitCode=1 Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.712141 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" 
event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951"} Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.712275 4752 scope.go:117] "RemoveContainer" containerID="73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.713911 4752 scope.go:117] "RemoveContainer" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" Feb 27 17:37:33 crc kubenswrapper[4752]: E0227 17:37:33.714227 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.740997 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manag
er\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.761487 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.784878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-c
ert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.807389 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.825487 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.843759 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.869421 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.896181 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.906650 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.906722 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.906723 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.906862 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:33 crc kubenswrapper[4752]: E0227 17:37:33.907090 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:33 crc kubenswrapper[4752]: E0227 17:37:33.907236 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:33 crc kubenswrapper[4752]: E0227 17:37:33.907452 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:33 crc kubenswrapper[4752]: E0227 17:37:33.907656 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.927113 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c06a667cc5d076b716fa5bcf723d492ff26bc1bf27874e1f7fea7e5d7e7696\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:17Z\\\",\\\"message\\\":\\\"tring]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0227 17:37:17.558333 6884 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0227 17:37:17.558338 6884 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF0227 17:37:17.558323 6884 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:17Z is after 2025-08-24T17:21:41Z]\\\\nI0227 17:37:17.558085 6884 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.945289 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.968232 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:33 crc kubenswrapper[4752]: I0227 17:37:33.989473 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:33Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.007985 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.023266 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.039299 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.059719 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.078670 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.720072 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/2.log" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.725031 
4752 scope.go:117] "RemoveContainer" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" Feb 27 17:37:34 crc kubenswrapper[4752]: E0227 17:37:34.725386 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.747744 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.770182 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.787214 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.802682 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.819453 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.837419 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.857254 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.872923 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.890969 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.907729 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:34 crc kubenswrapper[4752]: E0227 17:37:34.908058 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.911968 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.944879 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.962042 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:34 crc kubenswrapper[4752]: I0227 17:37:34.981018 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.001748 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:34Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.021366 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:35Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.038463 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:35Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.060703 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:35Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.906633 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.906717 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:35 crc kubenswrapper[4752]: E0227 17:37:35.906796 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:35 crc kubenswrapper[4752]: E0227 17:37:35.906934 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.907068 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:35 crc kubenswrapper[4752]: E0227 17:37:35.907205 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:35 crc kubenswrapper[4752]: I0227 17:37:35.907262 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:35 crc kubenswrapper[4752]: E0227 17:37:35.907351 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:36 crc kubenswrapper[4752]: E0227 17:37:36.020999 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.903952 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.904085 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.904210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.904273 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904398 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904468 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:38:09.904447454 +0000 UTC m=+189.811264345 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904533 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904635 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904709 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904678 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:38:09.904639089 +0000 UTC m=+189.811455980 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904732 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904787 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:38:09.904753732 +0000 UTC m=+189.811570613 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.904834 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:38:09.904815543 +0000 UTC m=+189.811632564 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.906013 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.906041 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.906249 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.906326 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.906425 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:37 crc kubenswrapper[4752]: I0227 17:37:37.906473 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.906550 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:37 crc kubenswrapper[4752]: E0227 17:37:37.906626 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:38 crc kubenswrapper[4752]: I0227 17:37:38.005588 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:38 crc kubenswrapper[4752]: I0227 17:37:38.005714 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.005895 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.005925 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.005962 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.005983 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.006021 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:38:10.005992622 +0000 UTC m=+189.912809503 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:37:38 crc kubenswrapper[4752]: E0227 17:37:38.006056 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:38:10.006032423 +0000 UTC m=+189.912849314 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:37:39 crc kubenswrapper[4752]: I0227 17:37:39.906236 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:39 crc kubenswrapper[4752]: I0227 17:37:39.906355 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:39 crc kubenswrapper[4752]: I0227 17:37:39.906386 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:39 crc kubenswrapper[4752]: I0227 17:37:39.906396 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:39 crc kubenswrapper[4752]: E0227 17:37:39.907596 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:39 crc kubenswrapper[4752]: E0227 17:37:39.907723 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:39 crc kubenswrapper[4752]: E0227 17:37:39.907797 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:39 crc kubenswrapper[4752]: E0227 17:37:39.907833 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.473621 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.473686 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.473707 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.473733 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.473754 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:40Z","lastTransitionTime":"2026-02-27T17:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.495540 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.501878 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.502094 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.502323 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.502489 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.502619 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:40Z","lastTransitionTime":"2026-02-27T17:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.523673 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.529612 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.529681 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.529700 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.529729 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.529749 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:40Z","lastTransitionTime":"2026-02-27T17:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.551679 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload elided; identical to the attempt logged above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.557738 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.557809 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.557831 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.557858 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.557879 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:40Z","lastTransitionTime":"2026-02-27T17:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.581920 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload elided; identical to the attempt logged above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.588170 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.588229 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.588248 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.588273 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.588295 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:40Z","lastTransitionTime":"2026-02-27T17:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.610086 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload elided; identical to the attempt logged above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:40 crc kubenswrapper[4752]: E0227 17:37:40.610618 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.926751 4752 status_manager.go:875] "Failed to update status for pod"
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.946629 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.965989 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:40 crc kubenswrapper[4752]: I0227 17:37:40.982236 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.001758 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:40Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.017720 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: E0227 17:37:41.026543 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.055664 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.078862 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.100866 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.118005 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.135243 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.157368 4752 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.189381 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.205772 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.224866 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.240651 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.264035 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:41Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.905764 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.905859 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:41 crc kubenswrapper[4752]: E0227 17:37:41.906035 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.906205 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:41 crc kubenswrapper[4752]: I0227 17:37:41.906227 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:41 crc kubenswrapper[4752]: E0227 17:37:41.906364 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:41 crc kubenswrapper[4752]: E0227 17:37:41.906481 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:41 crc kubenswrapper[4752]: E0227 17:37:41.906621 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:43 crc kubenswrapper[4752]: I0227 17:37:43.905942 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:43 crc kubenswrapper[4752]: I0227 17:37:43.905992 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:43 crc kubenswrapper[4752]: I0227 17:37:43.906071 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:43 crc kubenswrapper[4752]: I0227 17:37:43.905944 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:43 crc kubenswrapper[4752]: E0227 17:37:43.906225 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:43 crc kubenswrapper[4752]: E0227 17:37:43.906269 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:43 crc kubenswrapper[4752]: E0227 17:37:43.906394 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:43 crc kubenswrapper[4752]: E0227 17:37:43.906508 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:45 crc kubenswrapper[4752]: I0227 17:37:45.906538 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:45 crc kubenswrapper[4752]: I0227 17:37:45.906610 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:45 crc kubenswrapper[4752]: I0227 17:37:45.906699 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:45 crc kubenswrapper[4752]: E0227 17:37:45.906758 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:45 crc kubenswrapper[4752]: I0227 17:37:45.906792 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:45 crc kubenswrapper[4752]: E0227 17:37:45.906920 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:45 crc kubenswrapper[4752]: E0227 17:37:45.907022 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:45 crc kubenswrapper[4752]: E0227 17:37:45.907268 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:46 crc kubenswrapper[4752]: E0227 17:37:46.027931 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:46 crc kubenswrapper[4752]: I0227 17:37:46.907108 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:46 crc kubenswrapper[4752]: E0227 17:37:46.907714 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:37:47 crc kubenswrapper[4752]: I0227 17:37:47.906456 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:47 crc kubenswrapper[4752]: I0227 17:37:47.906523 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:47 crc kubenswrapper[4752]: I0227 17:37:47.906549 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:47 crc kubenswrapper[4752]: I0227 17:37:47.906474 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:47 crc kubenswrapper[4752]: E0227 17:37:47.906678 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:47 crc kubenswrapper[4752]: E0227 17:37:47.906818 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:47 crc kubenswrapper[4752]: E0227 17:37:47.906956 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:47 crc kubenswrapper[4752]: E0227 17:37:47.907079 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:48 crc kubenswrapper[4752]: I0227 17:37:48.907176 4752 scope.go:117] "RemoveContainer" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" Feb 27 17:37:48 crc kubenswrapper[4752]: E0227 17:37:48.907449 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:37:49 crc kubenswrapper[4752]: I0227 17:37:49.907018 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:49 crc kubenswrapper[4752]: I0227 17:37:49.907093 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:49 crc kubenswrapper[4752]: I0227 17:37:49.907174 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:49 crc kubenswrapper[4752]: E0227 17:37:49.907407 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:49 crc kubenswrapper[4752]: I0227 17:37:49.907433 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:49 crc kubenswrapper[4752]: E0227 17:37:49.907632 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:49 crc kubenswrapper[4752]: E0227 17:37:49.907853 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:49 crc kubenswrapper[4752]: E0227 17:37:49.908062 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.708784 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.708847 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.708868 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.708894 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.708913 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:50Z","lastTransitionTime":"2026-02-27T17:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.730477 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.734966 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.735020 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.735034 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.735056 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.735071 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:50Z","lastTransitionTime":"2026-02-27T17:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.754191 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.758708 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.758768 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.758786 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.758810 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.758828 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:50Z","lastTransitionTime":"2026-02-27T17:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.780821 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.786345 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.786401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.786425 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.786451 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.786469 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:50Z","lastTransitionTime":"2026-02-27T17:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.802956 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.807398 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.807461 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.807480 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.807508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.807525 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:37:50Z","lastTransitionTime":"2026-02-27T17:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.824877 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: E0227 17:37:50.825230 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.928269 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.944576 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.959840 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.974972 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:50 crc kubenswrapper[4752]: I0227 17:37:50.992304 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:50Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.004769 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.018720 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: E0227 17:37:51.029853 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.033737 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a7216
9b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.067705 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d703
7b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.085072 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.103196 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.118456 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.138719 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.155864 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.172618 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.188534 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.207677 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:51Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.906504 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.906591 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:51 crc kubenswrapper[4752]: E0227 17:37:51.906633 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 17:37:51 crc kubenswrapper[4752]: E0227 17:37:51.906752 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.906833 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:37:51 crc kubenswrapper[4752]: E0227 17:37:51.906895 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.906940 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:51 crc kubenswrapper[4752]: E0227 17:37:51.907034 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:37:51 crc kubenswrapper[4752]: I0227 17:37:51.922369 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.799181 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/0.log"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.799243 4752 generic.go:334] "Generic (PLEG): container finished" podID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" containerID="ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1" exitCode=1
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.799285 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerDied","Data":"ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1"}
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.799962 4752 scope.go:117] "RemoveContainer" containerID="ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.818313 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.839618 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.861072 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.876766 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.895948 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.906592 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:37:53 crc kubenswrapper[4752]: E0227 17:37:53.907002 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.907189 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:37:53 crc kubenswrapper[4752]: E0227 17:37:53.907256 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.907365 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:37:53 crc kubenswrapper[4752]: E0227 17:37:53.907448 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.907569 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:37:53 crc kubenswrapper[4752]: E0227 17:37:53.907631 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.911922 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.932388 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s.
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.952484 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.966465 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.977031 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.987329 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:53 crc kubenswrapper[4752]: I0227 17:37:53.998694 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:53Z is after 2025-08-24T17:21:41Z"
Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.020190 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.029584 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.044059 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.054574 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.068626 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.094323 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.804735 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/0.log" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.804815 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerStarted","Data":"d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe"} Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.831678 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.850942 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.873823 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.899326 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.924404 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.947806 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.963014 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.980305 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:54 crc kubenswrapper[4752]: I0227 17:37:54.997878 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:54Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.021733 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.044241 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.062555 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 
17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.085779 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.111058 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.142099 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d703
7b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.158307 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.177516 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:55 crc kubenswrapper[4752]: I0227 17:37:55.193423 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:37:55Z is after 2025-08-24T17:21:41Z" Feb 27 17:37:56 crc kubenswrapper[4752]: I0227 17:37:56.148814 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:56 crc kubenswrapper[4752]: I0227 17:37:56.148950 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:56 crc kubenswrapper[4752]: I0227 17:37:56.148908 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:56 crc kubenswrapper[4752]: I0227 17:37:56.149032 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:56 crc kubenswrapper[4752]: E0227 17:37:56.149402 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:56 crc kubenswrapper[4752]: E0227 17:37:56.149404 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:56 crc kubenswrapper[4752]: E0227 17:37:56.149597 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:56 crc kubenswrapper[4752]: E0227 17:37:56.149766 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:56 crc kubenswrapper[4752]: E0227 17:37:56.149819 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:37:57 crc kubenswrapper[4752]: I0227 17:37:57.906784 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:57 crc kubenswrapper[4752]: E0227 17:37:57.906944 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:57 crc kubenswrapper[4752]: I0227 17:37:57.907480 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:57 crc kubenswrapper[4752]: I0227 17:37:57.907524 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:57 crc kubenswrapper[4752]: E0227 17:37:57.907612 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:57 crc kubenswrapper[4752]: E0227 17:37:57.907718 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:57 crc kubenswrapper[4752]: I0227 17:37:57.907569 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:57 crc kubenswrapper[4752]: E0227 17:37:57.908598 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:57 crc kubenswrapper[4752]: I0227 17:37:57.920601 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 27 17:37:59 crc kubenswrapper[4752]: I0227 17:37:59.906759 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:37:59 crc kubenswrapper[4752]: I0227 17:37:59.906870 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:37:59 crc kubenswrapper[4752]: I0227 17:37:59.906884 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:37:59 crc kubenswrapper[4752]: E0227 17:37:59.907001 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:37:59 crc kubenswrapper[4752]: I0227 17:37:59.907038 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:37:59 crc kubenswrapper[4752]: E0227 17:37:59.907391 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:37:59 crc kubenswrapper[4752]: E0227 17:37:59.907589 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:37:59 crc kubenswrapper[4752]: E0227 17:37:59.907744 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:37:59 crc kubenswrapper[4752]: I0227 17:37:59.908013 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:37:59 crc kubenswrapper[4752]: E0227 17:37:59.908361 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907441 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907508 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907526 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907553 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907570 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:00Z","lastTransitionTime":"2026-02-27T17:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.907919 4752 scope.go:117] "RemoveContainer" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" Feb 27 17:38:00 crc kubenswrapper[4752]: E0227 17:38:00.929472 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.929779 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2462e5c5-9a89-44fd-a239-ae0bcb28474b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.936173 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.936226 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.936294 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.937252 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.937451 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:00Z","lastTransitionTime":"2026-02-27T17:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.949312 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: E0227 17:38:00.954551 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3
997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z"
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.958597 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.958637 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.958646 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.958660 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.958671 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:00Z","lastTransitionTime":"2026-02-27T17:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.967424 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: E0227 17:38:00.978883 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.983514 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:00Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.984738 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.984793 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.984802 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.984815 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:00 crc kubenswrapper[4752]: I0227 17:38:00.984844 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:00Z","lastTransitionTime":"2026-02-27T17:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.003122 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.005798 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.008411 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.008473 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:01 crc 
kubenswrapper[4752]: I0227 17:38:01.008492 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.008516 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.008532 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:01Z","lastTransitionTime":"2026-02-27T17:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.026202 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.026331 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.030922 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.051070 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.068614 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.087465 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.104427 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.125940 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.149721 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.152049 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.170413 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.172270 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/2.log" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.176115 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.176784 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.186245 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.201716 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.224272 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.253118 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d703
7b749ded5c707f607cd89951\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.267226 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.285581 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.301901 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2462e5c5-9a89-44fd-a239-ae0bcb28474b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.316927 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.332676 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.347315 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.368817 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.382395 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.395539 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.415742 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.436735 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.451170 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.474752 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.514922 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.536499 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.559904 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.592959 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.619767 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe0
9b7c6c3f75f88e2a7fad052a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.631012 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.642135 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.655636 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:01Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.906408 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.906567 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.906642 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.906696 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.906784 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:01 crc kubenswrapper[4752]: I0227 17:38:01.906796 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.906985 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:01 crc kubenswrapper[4752]: E0227 17:38:01.907109 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.181492 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/3.log" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.183223 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/2.log" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.187182 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" exitCode=1 Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.187243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.187355 4752 scope.go:117] "RemoveContainer" containerID="b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.188530 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:02 crc kubenswrapper[4752]: E0227 17:38:02.188876 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.208837 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.231080 4752 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.252835 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.278455 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.299867 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe0
9b7c6c3f75f88e2a7fad052a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6226881bade67276c9d38e1bf58dee9b4a9d7037b749ded5c707f607cd89951\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:32Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0227 17:37:32.860048 7066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:37:32.860071 7066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0227 17:37:32.860098 7066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0227 17:37:32.860136 7066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:37:32.860186 7066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0227 17:37:32.860205 7066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:37:32.860209 7066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:37:32.860226 7066 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:37:32.860238 7066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0227 17:37:32.860234 7066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:37:32.860212 7066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:37:32.860249 7066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:37:32.860280 7066 factory.go:656] Stopping watch factory\\\\nI0227 17:37:32.860294 7066 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:37:32.860301 7066 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:38:02Z\\\",\\\"message\\\":\\\"oval\\\\nI0227 17:38:02.085677 7352 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:38:02.085688 7352 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:38:02.085722 7352 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:38:02.085736 7352 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:38:02.085744 7352 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:38:02.085752 7352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:38:02.085760 7352 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:38:02.087017 7352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:38:02.087065 7352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:38:02.087085 7352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:38:02.087092 7352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:38:02.087114 7352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:38:02.087120 7352 factory.go:656] Stopping watch factory\\\\nI0227 17:38:02.087132 7352 handler.go:208] Removed *v1.Node event handler 
7\\\\nI0227 17:38:02.087164 7352 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:38:02.087173 7352 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:38:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:38:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0
0681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.313023 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.330814 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.345637 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2462e5c5-9a89-44fd-a239-ae0bcb28474b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.363083 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.378501 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.394995 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.409935 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.427352 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.445213 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.458339 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.473944 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.492456 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.513105 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:02 crc kubenswrapper[4752]: I0227 17:38:02.532220 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:02Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.192712 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/3.log" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.198195 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:03 crc kubenswrapper[4752]: E0227 17:38:03.198568 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.218135 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2462e5c5-9a89-44fd-a239-ae0bcb28474b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.238858 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.258252 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.279081 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.308229 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.327474 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.345409 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.361069 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.373866 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.386525 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.408225 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.424708 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.443853 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.461666 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.486101 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.512742 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe0
9b7c6c3f75f88e2a7fad052a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:38:02Z\\\",\\\"message\\\":\\\"oval\\\\nI0227 17:38:02.085677 7352 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:38:02.085688 7352 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:38:02.085722 7352 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:38:02.085736 7352 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:38:02.085744 7352 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:38:02.085752 7352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:38:02.085760 7352 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:38:02.087017 7352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:38:02.087065 7352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:38:02.087085 7352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:38:02.087092 7352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:38:02.087114 7352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:38:02.087120 7352 factory.go:656] Stopping watch factory\\\\nI0227 17:38:02.087132 7352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:38:02.087164 7352 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:38:02.087173 7352 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:38:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:38:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.525457 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.540400 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.556661 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:03Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.905927 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.905996 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.906011 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:03 crc kubenswrapper[4752]: I0227 17:38:03.905949 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:03 crc kubenswrapper[4752]: E0227 17:38:03.906231 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:03 crc kubenswrapper[4752]: E0227 17:38:03.906335 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:03 crc kubenswrapper[4752]: E0227 17:38:03.906388 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:03 crc kubenswrapper[4752]: E0227 17:38:03.906461 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:05 crc kubenswrapper[4752]: I0227 17:38:05.906413 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:05 crc kubenswrapper[4752]: I0227 17:38:05.906492 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:05 crc kubenswrapper[4752]: I0227 17:38:05.906438 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:05 crc kubenswrapper[4752]: I0227 17:38:05.906598 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:05 crc kubenswrapper[4752]: E0227 17:38:05.906802 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:05 crc kubenswrapper[4752]: E0227 17:38:05.906885 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:05 crc kubenswrapper[4752]: E0227 17:38:05.906721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:05 crc kubenswrapper[4752]: E0227 17:38:05.906992 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:06 crc kubenswrapper[4752]: E0227 17:38:06.152897 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:07 crc kubenswrapper[4752]: I0227 17:38:07.906543 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:07 crc kubenswrapper[4752]: I0227 17:38:07.906597 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:07 crc kubenswrapper[4752]: I0227 17:38:07.906548 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:07 crc kubenswrapper[4752]: I0227 17:38:07.906675 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:07 crc kubenswrapper[4752]: E0227 17:38:07.906779 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:07 crc kubenswrapper[4752]: E0227 17:38:07.907463 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:07 crc kubenswrapper[4752]: E0227 17:38:07.907659 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:07 crc kubenswrapper[4752]: E0227 17:38:07.907954 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.906184 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.906184 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.906923 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.906232 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.907018 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.906184 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.907063 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.907211 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.997938 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.998136 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998234 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.998178627 +0000 UTC m=+253.904995518 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998399 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998441 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998468 4752 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998509 4752 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.998400 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998568 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.998537956 +0000 UTC m=+253.905354847 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998645 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.998632448 +0000 UTC m=+253.905449309 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 27 17:38:09 crc kubenswrapper[4752]: I0227 17:38:09.998666 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998831 4752 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:38:09 crc kubenswrapper[4752]: E0227 17:38:09.998885 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.998876494 +0000 UTC m=+253.905693355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 27 17:38:10 crc kubenswrapper[4752]: I0227 17:38:10.099849 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:10 crc kubenswrapper[4752]: I0227 17:38:10.099958 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100215 4752 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100251 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100308 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs podName:937bbb35-a3c2-435c-86c5-1072f3a54595 nodeName:}" failed. No retries permitted until 2026-02-27 17:39:14.100282568 +0000 UTC m=+254.007099459 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs") pod "network-metrics-daemon-jkjwj" (UID: "937bbb35-a3c2-435c-86c5-1072f3a54595") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100327 4752 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100354 4752 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:38:10 crc kubenswrapper[4752]: E0227 17:38:10.100494 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-27 17:39:14.100443472 +0000 UTC m=+254.007260493 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 27 17:38:10 crc kubenswrapper[4752]: I0227 17:38:10.931050 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1024425-74cb-401d-961a-72058a77a919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:36:53Z\\\",\\\"message\\\":\\\"le observer\\\\nW0227 17:36:53.508646 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0227 17:36:53.508779 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0227 17:36:53.509492 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1825321194/tls.crt::/tmp/serving-cert-1825321194/tls.key\\\\\\\"\\\\nI0227 17:36:53.902105 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0227 17:36:53.905975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0227 17:36:53.906023 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0227 17:36:53.906078 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0227 17:36:53.906089 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0227 17:36:53.913727 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0227 17:36:53.913754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913759 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0227 17:36:53.913764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0227 17:36:53.913768 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0227 17:36:53.913771 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0227 17:36:53.913774 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0227 17:36:53.913764 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0227 17:36:53.916552 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:36:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:10 crc kubenswrapper[4752]: I0227 17:38:10.966856 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dbf91e6-0901-4b3e-9435-ebb8cd4ada3f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://081e28957febf288c22fc5a14ff6072c732bb9035252e40bf7d60a7da810561c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1745cd5bb13aec093bf30d5cbfe3a1e32770fdc9b8b62d0bba6f746392f9f093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95f09f9b4d4d8489504262ea444217edbbd8c6b1233cd290473c935cf9615b90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7c6d69fe9ecffb4c2945ffebd7a975ef7e9f8
f6876d2286bd72207661c1ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eb6ebc1b83a48bc9ae778bbe3a64a5acefc3d6e7c9077021af7548bce8acd39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6435db3b3bd1e40014f3a292d5e81ba3995d75800b2639c2bb3c47f640b8cd61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbae603425df2791994ee66c38d560123e07808f5112cdfa5b80a501cb37ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b95784babd0a3f06aa90e29e671a93c9f3fc0b7d851b4f69b23dea2bff77090\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:10 crc kubenswrapper[4752]: I0227 17:38:10.999201 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"690b0de6-1f38-4265-bfff-2077a349f89c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe0
9b7c6c3f75f88e2a7fad052a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:38:02Z\\\",\\\"message\\\":\\\"oval\\\\nI0227 17:38:02.085677 7352 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0227 17:38:02.085688 7352 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0227 17:38:02.085722 7352 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0227 17:38:02.085736 7352 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0227 17:38:02.085744 7352 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0227 17:38:02.085752 7352 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0227 17:38:02.085760 7352 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0227 17:38:02.087017 7352 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0227 17:38:02.087065 7352 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0227 17:38:02.087085 7352 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0227 17:38:02.087092 7352 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0227 17:38:02.087114 7352 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0227 17:38:02.087120 7352 factory.go:656] Stopping watch factory\\\\nI0227 17:38:02.087132 7352 handler.go:208] Removed *v1.Node event handler 7\\\\nI0227 17:38:02.087164 7352 ovnkube.go:599] Stopped ovnkube\\\\nI0227 17:38:02.087173 7352 handler.go:208] Removed *v1.Node event handler 2\\\\nI0227 17:38:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:38:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhb87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sfztq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:10Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.016429 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jlxqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aad5923-f151-43de-a1a0-b8c6906c2d7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49f03929dd1e6f1510d46093e8ae9214eefc7c04d1db680cd69fde52c9344298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jlxqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.035396 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ce186c-640f-4ade-94e1-587c1440fe87\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75b5f36c8fa728c5c34b846e1b603069f07726a5213fdf0bf0ae8b6b12b676b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zg72r\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cm8wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.061491 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qpbx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"098f70a1-c2c2-44ce-9c0c-356e7eea2da9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-27T17:37:53Z\\\",\\\"message\\\":\\\"2026-02-27T17:37:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24\\\\n2026-02-27T17:37:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_aa3147ee-560e-474f-bbcd-bd608ada4c24 to /host/opt/cni/bin/\\\\n2026-02-27T17:37:08Z [verbose] multus-daemon started\\\\n2026-02-27T17:37:08Z [verbose] Readiness Indicator file check\\\\n2026-02-27T17:37:53Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f5dqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qpbx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.081649 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2462e5c5-9a89-44fd-a239-ae0bcb28474b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9103e7281307aa7f6922806e847df158e614abbd9d2cfc02b79d88d99ef6125\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f34f57ccf635b4f036f46ae32672363739001a0ff840bfad06781d72976b74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa560d4b47bb3dd6066fa62fa08c6cb2e4e736e2e7be83517df2da33a3e18037\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cde8121f5098e2ad1799ff64e985d3bf8f7321b1136e669aa787d1c2970f6440\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.100357 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.118067 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a772c7aa8838bea02d77323a30292917c5106470a965f45d100e33e60b3d8e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.136601 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.153853 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.160753 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ntzss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e5e2ad1-375b-4340-a583-e32742e736e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://252ad8ab6d056c4b6aa5591b2eb7b61105a3e0ac19b98097f6bd5766b98c7f7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f0756538e13e05c90dcc016ea8cda31506b3e0b4453b873e7c41cc37fff9644\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fd58ceeaae17c265eee7de1eec21c655a0b943f44aea6097d1b60ce2a058edb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b247927b0617991f638b6e930342ac3838a80713d2d7ad7a3ec1877f7ce9a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b944c2fbd5fc27162740f68cead7545c572492fe5395421b373be3aa4ff9e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57de17a1b406b6abee491a34676d6a0c202dcdb279859c811474fda02bf44297\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://590b059f184b66a468123967bf7fe298308ae95e481a3e9d9489cbebc4034a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ntzss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.179773 4752 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c7d88c1-f57d-4cd6-84b6-0d3a6d61366b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:36:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36a90f4c7c18fe68ad4aabb794bb89a722ba0e79f0d577e43cd4783b88781de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://baf0bb2bec21350587defabe8172bf2a438de43ad4101a7ec15ea515e7a2624d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-27T17:35:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0227 17:35:03.074302 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0227 17:35:03.076341 1 observer_polling.go:159] Starting file observer\\\\nI0227 17:35:03.114487 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0227 17:35:03.117744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0227 17:35:33.288292 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0227 17:35:33.288564 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:35:32Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ffbf0795b45df43efea5060fcbdef64063b360d6841281eb78a5179c2f62ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc81955110624a62223bd05aebbad30eb6a8467b33de77a99daacdd0f6ab7f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.199578 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://345b6aa5640e3a483968ee51953b07e13323f9a6ffed421509dc7a806d526100\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.213527 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"937bbb35-a3c2-435c-86c5-1072f3a54595\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2jmrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jkjwj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.230569 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5vlk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80ecbe44-7a3a-4cf1-9be4-b2f304a4fade\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53dc23b1ea3fe03171bff4757ee01e66418bad442ac3a3489002239e543dbd6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccstl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5vlk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.245652 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82c5f928-ac2b-4288-ac82-5ff668d4212b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:35:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b99ad96105738ae86e79d21cbf8ce43e85d7410e44e757f70bf0ca3da252fa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:35:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1978f9a878a01d29db332bc246fdf21a24d355bcd8c8580cb3a72169b68278bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-27T17:35:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-27T17:35:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:35:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.263943 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2472139744ba3e03c75f0553bd60e1cb207dc09b8476cb53d54ba4f4490080e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://272ec71ef97366504dfbfefb8cb3f687f41639740074ce3199b54d15ee328e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.312302 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.327281 4752 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28ad54bb-a1b9-4b0f-85d5-d2e656b16bdd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-27T17:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d3acd38b3425ce4a5d9befc6cc4e74b0192874e4bc99516544c8502e71be3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98009a8a13b1dfd8f9ceb1385dc6e4c9e209083771716793c0822fe287e5dfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-27T17:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fj5v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-27T17:37:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jpsg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 
17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.426750 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.426807 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.426829 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.426879 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.426902 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:11Z","lastTransitionTime":"2026-02-27T17:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.451616 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.455863 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.455948 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.455975 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.456007 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.456030 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:11Z","lastTransitionTime":"2026-02-27T17:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.472816 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.477395 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.477482 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.477507 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.477538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.477564 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:11Z","lastTransitionTime":"2026-02-27T17:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.500407 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.504745 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.504818 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.504843 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.504875 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.504914 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:11Z","lastTransitionTime":"2026-02-27T17:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.527904 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.533435 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.533511 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.533538 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.533570 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.533595 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:11Z","lastTransitionTime":"2026-02-27T17:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.552092 4752 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-27T17:38:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"78085164-654a-4899-838b-cadb0192fc93\\\",\\\"systemUUID\\\":\\\"3997dbc0-568e-470a-afbe-a819259fb419\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-27T17:38:11Z is after 2025-08-24T17:21:41Z" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.552344 4752 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.906459 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.906500 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.906632 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.906854 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.906965 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.907183 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:11 crc kubenswrapper[4752]: I0227 17:38:11.907425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:11 crc kubenswrapper[4752]: E0227 17:38:11.907580 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:13 crc kubenswrapper[4752]: I0227 17:38:13.906652 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:13 crc kubenswrapper[4752]: E0227 17:38:13.906844 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:13 crc kubenswrapper[4752]: I0227 17:38:13.906912 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:13 crc kubenswrapper[4752]: I0227 17:38:13.906941 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:13 crc kubenswrapper[4752]: I0227 17:38:13.907424 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:13 crc kubenswrapper[4752]: E0227 17:38:13.907441 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:13 crc kubenswrapper[4752]: E0227 17:38:13.907681 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:13 crc kubenswrapper[4752]: I0227 17:38:13.907826 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:38:13 crc kubenswrapper[4752]: E0227 17:38:13.907922 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:13 crc kubenswrapper[4752]: E0227 17:38:13.908137 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 27 17:38:15 crc kubenswrapper[4752]: I0227 17:38:15.906741 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:15 crc kubenswrapper[4752]: I0227 17:38:15.906824 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:15 crc kubenswrapper[4752]: E0227 17:38:15.906926 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:15 crc kubenswrapper[4752]: I0227 17:38:15.907003 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:15 crc kubenswrapper[4752]: I0227 17:38:15.907001 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:15 crc kubenswrapper[4752]: E0227 17:38:15.907539 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:15 crc kubenswrapper[4752]: E0227 17:38:15.907698 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:15 crc kubenswrapper[4752]: E0227 17:38:15.907760 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:16 crc kubenswrapper[4752]: E0227 17:38:16.155468 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:17 crc kubenswrapper[4752]: I0227 17:38:17.906567 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:17 crc kubenswrapper[4752]: I0227 17:38:17.906570 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:17 crc kubenswrapper[4752]: I0227 17:38:17.906695 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:17 crc kubenswrapper[4752]: I0227 17:38:17.906744 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:17 crc kubenswrapper[4752]: E0227 17:38:17.906750 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:17 crc kubenswrapper[4752]: E0227 17:38:17.906967 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:17 crc kubenswrapper[4752]: E0227 17:38:17.906986 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:17 crc kubenswrapper[4752]: E0227 17:38:17.907033 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:18 crc kubenswrapper[4752]: I0227 17:38:18.907444 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:18 crc kubenswrapper[4752]: E0227 17:38:18.907754 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:38:19 crc kubenswrapper[4752]: I0227 17:38:19.906093 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:19 crc kubenswrapper[4752]: I0227 17:38:19.906241 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:19 crc kubenswrapper[4752]: E0227 17:38:19.906336 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:19 crc kubenswrapper[4752]: I0227 17:38:19.906422 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:19 crc kubenswrapper[4752]: I0227 17:38:19.906450 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:19 crc kubenswrapper[4752]: E0227 17:38:19.906564 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:19 crc kubenswrapper[4752]: E0227 17:38:19.906751 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:19 crc kubenswrapper[4752]: E0227 17:38:19.906940 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:20 crc kubenswrapper[4752]: I0227 17:38:20.954460 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=29.954446019 podStartE2EDuration="29.954446019s" podCreationTimestamp="2026-02-27 17:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:20.953923516 +0000 UTC m=+200.860740377" watchObservedRunningTime="2026-02-27 17:38:20.954446019 +0000 UTC m=+200.861262870" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.025972 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jlxqp" podStartSLOduration=145.025912004 podStartE2EDuration="2m25.025912004s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.00427424 +0000 UTC m=+200.911091161" watchObservedRunningTime="2026-02-27 17:38:21.025912004 +0000 UTC m=+200.932728895" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.027021 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podStartSLOduration=145.027009601 podStartE2EDuration="2m25.027009601s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.025852422 +0000 UTC m=+200.932669323" watchObservedRunningTime="2026-02-27 17:38:21.027009601 +0000 UTC m=+200.933826492" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.072323 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qpbx6" podStartSLOduration=145.072293209 podStartE2EDuration="2m25.072293209s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.043463477 +0000 UTC m=+200.950280338" watchObservedRunningTime="2026-02-27 17:38:21.072293209 +0000 UTC m=+200.979110100" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.136345 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ntzss" podStartSLOduration=145.13632635 podStartE2EDuration="2m25.13632635s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.135406578 +0000 UTC m=+201.042223439" watchObservedRunningTime="2026-02-27 17:38:21.13632635 +0000 UTC m=+201.043143211" Feb 27 17:38:21 crc kubenswrapper[4752]: E0227 17:38:21.156388 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.158848 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=24.158822946 podStartE2EDuration="24.158822946s" podCreationTimestamp="2026-02-27 17:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.154744145 +0000 UTC m=+201.061560996" watchObservedRunningTime="2026-02-27 17:38:21.158822946 +0000 UTC m=+201.065639827" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.198764 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5vlk6" podStartSLOduration=145.198741371 podStartE2EDuration="2m25.198741371s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.198611718 +0000 UTC m=+201.105428579" watchObservedRunningTime="2026-02-27 17:38:21.198741371 +0000 UTC m=+201.105558232" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.213341 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.213323182 podStartE2EDuration="57.213323182s" podCreationTimestamp="2026-02-27 17:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.212932202 +0000 UTC m=+201.119749103" watchObservedRunningTime="2026-02-27 17:38:21.213323182 +0000 UTC m=+201.120140023" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.258481 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jpsg7" podStartSLOduration=144.258465626 podStartE2EDuration="2m24.258465626s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.258181719 +0000 UTC m=+201.164998610" watchObservedRunningTime="2026-02-27 17:38:21.258465626 +0000 UTC m=+201.165282477" Feb 27 17:38:21 crc 
Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.274916 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=54.274893682 podStartE2EDuration="54.274893682s" podCreationTimestamp="2026-02-27 17:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:21.274098702 +0000 UTC m=+201.180915563" watchObservedRunningTime="2026-02-27 17:38:21.274893682 +0000 UTC m=+201.181710563"
Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.906015 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.906101 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.906119 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:38:21 crc kubenswrapper[4752]: E0227 17:38:21.906192 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.906219 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:38:21 crc kubenswrapper[4752]: E0227 17:38:21.906401 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 17:38:21 crc kubenswrapper[4752]: E0227 17:38:21.906448 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.949275 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.949401 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.949424 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.949447 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 27 17:38:21 crc kubenswrapper[4752]: I0227 17:38:21.949463 4752 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-27T17:38:21Z","lastTransitionTime":"2026-02-27T17:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.007822 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4"] Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.008456 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.010902 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.011802 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.012007 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.012421 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.141717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.141787 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.141863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ba202cd5-dfdc-4319-adef-0d876cd6bd33-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.141899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba202cd5-dfdc-4319-adef-0d876cd6bd33-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.141961 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba202cd5-dfdc-4319-adef-0d876cd6bd33-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.161499 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.172369 4752 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243089 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba202cd5-dfdc-4319-adef-0d876cd6bd33-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243222 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243280 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243346 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba202cd5-dfdc-4319-adef-0d876cd6bd33-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243393 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba202cd5-dfdc-4319-adef-0d876cd6bd33-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: 
\"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243411 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.243520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ba202cd5-dfdc-4319-adef-0d876cd6bd33-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.244825 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba202cd5-dfdc-4319-adef-0d876cd6bd33-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.251684 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba202cd5-dfdc-4319-adef-0d876cd6bd33-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.263252 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba202cd5-dfdc-4319-adef-0d876cd6bd33-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mpfg4\" (UID: \"ba202cd5-dfdc-4319-adef-0d876cd6bd33\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" Feb 27 17:38:22 crc kubenswrapper[4752]: I0227 17:38:22.327392 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4"
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.279721 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" event={"ID":"ba202cd5-dfdc-4319-adef-0d876cd6bd33","Type":"ContainerStarted","Data":"a7270ff2810c84afe44e24bed9d226a08cfa8f20cd76cb318ed1e4f6585d86e2"}
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.280049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" event={"ID":"ba202cd5-dfdc-4319-adef-0d876cd6bd33","Type":"ContainerStarted","Data":"006626972c78a9520b63a83304fb2dd32182e1d7811f282c37a555ada5da7bf9"}
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.299863 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mpfg4" podStartSLOduration=147.299833321 podStartE2EDuration="2m27.299833321s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:23.298457637 +0000 UTC m=+203.205274498" watchObservedRunningTime="2026-02-27 17:38:23.299833321 +0000 UTC m=+203.206650212"
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.905833 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.905934 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.905938 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:38:23 crc kubenswrapper[4752]: I0227 17:38:23.906132 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:38:23 crc kubenswrapper[4752]: E0227 17:38:23.906119 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595"
Feb 27 17:38:23 crc kubenswrapper[4752]: E0227 17:38:23.906307 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 27 17:38:23 crc kubenswrapper[4752]: E0227 17:38:23.906371 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:23 crc kubenswrapper[4752]: E0227 17:38:23.906470 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:25 crc kubenswrapper[4752]: I0227 17:38:25.905863 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:25 crc kubenswrapper[4752]: I0227 17:38:25.905897 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:25 crc kubenswrapper[4752]: I0227 17:38:25.906006 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:25 crc kubenswrapper[4752]: E0227 17:38:25.906055 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:25 crc kubenswrapper[4752]: E0227 17:38:25.906084 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:25 crc kubenswrapper[4752]: E0227 17:38:25.906187 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:25 crc kubenswrapper[4752]: I0227 17:38:25.906439 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:25 crc kubenswrapper[4752]: E0227 17:38:25.906647 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:26 crc kubenswrapper[4752]: E0227 17:38:26.157532 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:27 crc kubenswrapper[4752]: I0227 17:38:27.906624 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:27 crc kubenswrapper[4752]: E0227 17:38:27.906812 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:27 crc kubenswrapper[4752]: I0227 17:38:27.907369 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:27 crc kubenswrapper[4752]: I0227 17:38:27.907456 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:27 crc kubenswrapper[4752]: I0227 17:38:27.907502 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:27 crc kubenswrapper[4752]: E0227 17:38:27.907603 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:27 crc kubenswrapper[4752]: E0227 17:38:27.907679 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:27 crc kubenswrapper[4752]: E0227 17:38:27.907752 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:28 crc kubenswrapper[4752]: I0227 17:38:28.906169 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.302728 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.305344 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d"} Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.305781 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.906712 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.906842 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:29 crc kubenswrapper[4752]: E0227 17:38:29.906890 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:29 crc kubenswrapper[4752]: E0227 17:38:29.907074 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.906739 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:29 crc kubenswrapper[4752]: E0227 17:38:29.907265 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:29 crc kubenswrapper[4752]: I0227 17:38:29.906711 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:29 crc kubenswrapper[4752]: E0227 17:38:29.907415 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:30 crc kubenswrapper[4752]: I0227 17:38:30.909231 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:30 crc kubenswrapper[4752]: E0227 17:38:30.909564 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:38:31 crc kubenswrapper[4752]: E0227 17:38:31.159131 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:31 crc kubenswrapper[4752]: I0227 17:38:31.906703 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:31 crc kubenswrapper[4752]: I0227 17:38:31.906829 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:31 crc kubenswrapper[4752]: E0227 17:38:31.906904 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:31 crc kubenswrapper[4752]: I0227 17:38:31.906922 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:31 crc kubenswrapper[4752]: I0227 17:38:31.907004 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:31 crc kubenswrapper[4752]: E0227 17:38:31.907250 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:31 crc kubenswrapper[4752]: E0227 17:38:31.907578 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:31 crc kubenswrapper[4752]: E0227 17:38:31.908591 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:33 crc kubenswrapper[4752]: I0227 17:38:33.906557 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:33 crc kubenswrapper[4752]: I0227 17:38:33.906559 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:33 crc kubenswrapper[4752]: E0227 17:38:33.907639 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:33 crc kubenswrapper[4752]: I0227 17:38:33.906697 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:33 crc kubenswrapper[4752]: I0227 17:38:33.906577 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:33 crc kubenswrapper[4752]: E0227 17:38:33.907849 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:33 crc kubenswrapper[4752]: E0227 17:38:33.908034 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:33 crc kubenswrapper[4752]: E0227 17:38:33.908194 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:35 crc kubenswrapper[4752]: I0227 17:38:35.906035 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:35 crc kubenswrapper[4752]: I0227 17:38:35.906133 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:35 crc kubenswrapper[4752]: E0227 17:38:35.906372 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:35 crc kubenswrapper[4752]: I0227 17:38:35.906417 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:35 crc kubenswrapper[4752]: I0227 17:38:35.906528 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:35 crc kubenswrapper[4752]: E0227 17:38:35.906685 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:35 crc kubenswrapper[4752]: E0227 17:38:35.906804 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:35 crc kubenswrapper[4752]: E0227 17:38:35.907257 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:36 crc kubenswrapper[4752]: E0227 17:38:36.160816 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:37 crc kubenswrapper[4752]: I0227 17:38:37.906342 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:37 crc kubenswrapper[4752]: I0227 17:38:37.906392 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:37 crc kubenswrapper[4752]: I0227 17:38:37.906392 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:37 crc kubenswrapper[4752]: I0227 17:38:37.906996 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:37 crc kubenswrapper[4752]: E0227 17:38:37.907339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:37 crc kubenswrapper[4752]: E0227 17:38:37.907464 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:37 crc kubenswrapper[4752]: E0227 17:38:37.907585 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:37 crc kubenswrapper[4752]: E0227 17:38:37.907714 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.345746 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/1.log" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.346633 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/0.log" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.346699 4752 generic.go:334] "Generic (PLEG): container finished" podID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" containerID="d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe" exitCode=1 Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.346750 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerDied","Data":"d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe"} Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.346801 4752 scope.go:117] "RemoveContainer" containerID="ca0c39841636b80e8872853f9f5695cffa06ce37002e2eb03206a41a5b7be1a1" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.347605 4752 scope.go:117] "RemoveContainer" containerID="d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe" Feb 27 17:38:39 crc kubenswrapper[4752]: E0227 17:38:39.347964 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qpbx6_openshift-multus(098f70a1-c2c2-44ce-9c0c-356e7eea2da9)\"" pod="openshift-multus/multus-qpbx6" podUID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.380704 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.380676525 podStartE2EDuration="1m30.380676525s" podCreationTimestamp="2026-02-27 17:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:29.334623231 +0000 UTC m=+209.241440112" watchObservedRunningTime="2026-02-27 17:38:39.380676525 +0000 UTC m=+219.287493406" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.906419 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.906502 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:39 crc kubenswrapper[4752]: E0227 17:38:39.906679 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.906736 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:39 crc kubenswrapper[4752]: I0227 17:38:39.906797 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:39 crc kubenswrapper[4752]: E0227 17:38:39.906970 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:39 crc kubenswrapper[4752]: E0227 17:38:39.907269 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:39 crc kubenswrapper[4752]: E0227 17:38:39.907423 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:40 crc kubenswrapper[4752]: I0227 17:38:40.353185 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/1.log" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.162639 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:41 crc kubenswrapper[4752]: I0227 17:38:41.906085 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:41 crc kubenswrapper[4752]: I0227 17:38:41.906131 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:41 crc kubenswrapper[4752]: I0227 17:38:41.906327 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.906456 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:41 crc kubenswrapper[4752]: I0227 17:38:41.906510 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.906671 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.906837 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.907406 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:41 crc kubenswrapper[4752]: I0227 17:38:41.907879 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:41 crc kubenswrapper[4752]: E0227 17:38:41.908129 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sfztq_openshift-ovn-kubernetes(690b0de6-1f38-4265-bfff-2077a349f89c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" Feb 27 17:38:43 crc kubenswrapper[4752]: I0227 17:38:43.905902 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:43 crc kubenswrapper[4752]: I0227 17:38:43.905962 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:43 crc kubenswrapper[4752]: I0227 17:38:43.905975 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:43 crc kubenswrapper[4752]: I0227 17:38:43.905920 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:43 crc kubenswrapper[4752]: E0227 17:38:43.906212 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:43 crc kubenswrapper[4752]: E0227 17:38:43.906346 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:43 crc kubenswrapper[4752]: E0227 17:38:43.906503 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:43 crc kubenswrapper[4752]: E0227 17:38:43.906681 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:44 crc kubenswrapper[4752]: I0227 17:38:44.193128 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:38:45 crc kubenswrapper[4752]: I0227 17:38:45.905924 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:45 crc kubenswrapper[4752]: E0227 17:38:45.906109 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:45 crc kubenswrapper[4752]: I0227 17:38:45.906379 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:45 crc kubenswrapper[4752]: E0227 17:38:45.906440 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:45 crc kubenswrapper[4752]: I0227 17:38:45.906587 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:45 crc kubenswrapper[4752]: E0227 17:38:45.906651 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:45 crc kubenswrapper[4752]: I0227 17:38:45.906794 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:45 crc kubenswrapper[4752]: E0227 17:38:45.906863 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:46 crc kubenswrapper[4752]: E0227 17:38:46.164894 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:47 crc kubenswrapper[4752]: I0227 17:38:47.906761 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:47 crc kubenswrapper[4752]: I0227 17:38:47.906843 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:47 crc kubenswrapper[4752]: I0227 17:38:47.906768 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:47 crc kubenswrapper[4752]: E0227 17:38:47.907062 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:47 crc kubenswrapper[4752]: E0227 17:38:47.907904 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:47 crc kubenswrapper[4752]: I0227 17:38:47.907988 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:47 crc kubenswrapper[4752]: E0227 17:38:47.908066 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:47 crc kubenswrapper[4752]: E0227 17:38:47.908255 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:49 crc kubenswrapper[4752]: I0227 17:38:49.906790 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:49 crc kubenswrapper[4752]: I0227 17:38:49.906897 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:49 crc kubenswrapper[4752]: I0227 17:38:49.906804 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:49 crc kubenswrapper[4752]: I0227 17:38:49.906835 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:49 crc kubenswrapper[4752]: E0227 17:38:49.907095 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:49 crc kubenswrapper[4752]: E0227 17:38:49.907312 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:49 crc kubenswrapper[4752]: E0227 17:38:49.907459 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:49 crc kubenswrapper[4752]: E0227 17:38:49.907598 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:51 crc kubenswrapper[4752]: E0227 17:38:51.166827 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:51 crc kubenswrapper[4752]: I0227 17:38:51.906711 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:51 crc kubenswrapper[4752]: I0227 17:38:51.906761 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:51 crc kubenswrapper[4752]: I0227 17:38:51.906787 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:51 crc kubenswrapper[4752]: E0227 17:38:51.906921 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:51 crc kubenswrapper[4752]: I0227 17:38:51.907081 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:51 crc kubenswrapper[4752]: E0227 17:38:51.907397 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:51 crc kubenswrapper[4752]: E0227 17:38:51.907971 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:51 crc kubenswrapper[4752]: E0227 17:38:51.908318 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:53 crc kubenswrapper[4752]: I0227 17:38:53.906539 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:53 crc kubenswrapper[4752]: I0227 17:38:53.906604 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:53 crc kubenswrapper[4752]: E0227 17:38:53.906739 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:53 crc kubenswrapper[4752]: I0227 17:38:53.906772 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:53 crc kubenswrapper[4752]: E0227 17:38:53.906962 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:53 crc kubenswrapper[4752]: E0227 17:38:53.907073 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:53 crc kubenswrapper[4752]: I0227 17:38:53.907427 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:53 crc kubenswrapper[4752]: E0227 17:38:53.907587 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:54 crc kubenswrapper[4752]: I0227 17:38:54.907132 4752 scope.go:117] "RemoveContainer" containerID="d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe" Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.413673 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/1.log" Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.414139 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerStarted","Data":"5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea"} Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.905685 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.905734 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:55 crc kubenswrapper[4752]: E0227 17:38:55.905863 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.905941 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:55 crc kubenswrapper[4752]: I0227 17:38:55.905963 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:55 crc kubenswrapper[4752]: E0227 17:38:55.906051 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:55 crc kubenswrapper[4752]: E0227 17:38:55.906301 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:55 crc kubenswrapper[4752]: E0227 17:38:55.906352 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:56 crc kubenswrapper[4752]: E0227 17:38:56.168866 4752 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 27 17:38:56 crc kubenswrapper[4752]: I0227 17:38:56.908200 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.422341 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/3.log" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.425647 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerStarted","Data":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.426329 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.465502 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podStartSLOduration=181.465483711 podStartE2EDuration="3m1.465483711s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:38:57.465432709 +0000 UTC m=+237.372249600" watchObservedRunningTime="2026-02-27 17:38:57.465483711 +0000 UTC m=+237.372300582" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.869693 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jkjwj"] Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.869854 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:57 crc kubenswrapper[4752]: E0227 17:38:57.870037 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.905910 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:57 crc kubenswrapper[4752]: E0227 17:38:57.906072 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.906185 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:57 crc kubenswrapper[4752]: E0227 17:38:57.906264 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:57 crc kubenswrapper[4752]: I0227 17:38:57.906335 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:57 crc kubenswrapper[4752]: E0227 17:38:57.906412 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:38:59 crc kubenswrapper[4752]: I0227 17:38:59.905982 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:38:59 crc kubenswrapper[4752]: I0227 17:38:59.906095 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:38:59 crc kubenswrapper[4752]: E0227 17:38:59.906617 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 27 17:38:59 crc kubenswrapper[4752]: I0227 17:38:59.906274 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:38:59 crc kubenswrapper[4752]: I0227 17:38:59.906274 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:38:59 crc kubenswrapper[4752]: E0227 17:38:59.906831 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 27 17:38:59 crc kubenswrapper[4752]: E0227 17:38:59.907214 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jkjwj" podUID="937bbb35-a3c2-435c-86c5-1072f3a54595" Feb 27 17:38:59 crc kubenswrapper[4752]: E0227 17:38:59.907318 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.906386 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.906555 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.906386 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.907934 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.909644 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.910558 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.910746 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.911505 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.911848 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 27 17:39:01 crc kubenswrapper[4752]: I0227 17:39:01.911939 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.818742 4752 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.892411 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.893648 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.894185 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.894574 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.895662 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hrbkp"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.897059 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.897732 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.898370 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.949248 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tjktn"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.962967 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.963522 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.966529 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8r7pq"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.967399 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvlkw"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.967823 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.968374 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.968734 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.969192 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lhljv"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.969651 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.970127 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.970368 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.973795 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.970668 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.982528 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-krm5q"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975216 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975282 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975310 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975833 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-serving-cert\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.982874 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.982912 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-policies\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.982937 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-serving-cert\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.982962 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f93b49-a038-467e-aebb-eecd7b9f307c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:02 crc 
kubenswrapper[4752]: I0227 17:39:02.982995 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit-dir\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983029 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj7nt\" (UniqueName: \"kubernetes.io/projected/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-kube-api-access-qj7nt\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983086 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983129 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-node-pullsecrets\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983172 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983208 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-client\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975180 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.970860 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983234 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-encryption-config\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983404 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhhf\" (UniqueName: \"kubernetes.io/projected/621fbcf2-089a-44b7-8130-e4f188d4b03f-kube-api-access-kdhhf\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983447 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983487 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974366 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983546 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983587 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f93b49-a038-467e-aebb-eecd7b9f307c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974428 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983618 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-dir\") pod 
\"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983750 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-image-import-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983767 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983805 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmm7\" (UniqueName: \"kubernetes.io/projected/fdb18d1a-6b47-4b81-808a-f6458470a201-kube-api-access-hmmm7\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-client\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983857 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5kpc\" (UniqueName: \"kubernetes.io/projected/88f93b49-a038-467e-aebb-eecd7b9f307c-kube-api-access-l5kpc\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983889 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-encryption-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983932 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.983982 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.984044 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975249 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974530 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974561 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974593 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974600 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974618 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974777 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974790 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.974911 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975212 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975218 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975309 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.975710 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976082 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976219 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976418 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976455 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976518 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 27 17:39:02 crc 
kubenswrapper[4752]: I0227 17:39:02.976531 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976632 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976671 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976707 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976809 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.976846 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.987968 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.988505 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zj6td"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.988568 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.988627 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.988742 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.988627 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.989379 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.991227 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.991895 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.992776 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd"] Feb 27 17:39:02 crc kubenswrapper[4752]: I0227 17:39:02.999942 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cfp2v"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.000437 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rxzmm"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.000551 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.000869 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.001105 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.001860 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jsq6c"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.002329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.014529 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.014820 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.014941 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.015503 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.015874 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.016106 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.022583 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.023287 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.027049 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.027408 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.027843 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.028375 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.028794 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.049794 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.055845 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.056548 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.057477 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.057720 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.058028 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.058040 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.059744 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.059932 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.061252 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.061424 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.061701 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.061958 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.062109 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.062282 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.062487 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.062620 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.062870 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.063449 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.063680 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.063798 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.063967 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.064266 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.064425 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.064665 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.065400 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.065850 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.066363 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.066914 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.067107 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.067550 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 
17:39:03.068569 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.080755 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.081448 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.082609 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.086472 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.086558 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.086669 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.086727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.086751 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-service-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.087001 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4eceb80-1269-438b-ad35-1a125e8b98c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.087050 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c3bbf7-f787-4b3f-8028-cdee09aba43e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.087069 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.088462 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vtk\" (UniqueName: \"kubernetes.io/projected/3083b21b-220e-4439-a3c1-18c79f073151-kube-api-access-q6vtk\") pod \"downloads-7954f5f757-cfp2v\" (UID: \"3083b21b-220e-4439-a3c1-18c79f073151\") " pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.088499 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh79f\" (UniqueName: \"kubernetes.io/projected/7d737ed6-90b8-4607-bf34-a21992a704e6-kube-api-access-kh79f\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.088532 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.088555 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjgqk\" (UniqueName: \"kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090248 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2g4j\" (UniqueName: \"kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090318 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090391 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/faf810e6-431a-40f6-b5fd-83b74fffc701-machine-approver-tls\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090737 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxprw\" (UniqueName: \"kubernetes.io/projected/faf810e6-431a-40f6-b5fd-83b74fffc701-kube-api-access-xxprw\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-dir\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090844 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f93b49-a038-467e-aebb-eecd7b9f307c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.090867 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-service-ca-bundle\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091171 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-serving-cert\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091215 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-image-import-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091321 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-trusted-ca\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091379 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmm7\" (UniqueName: \"kubernetes.io/projected/fdb18d1a-6b47-4b81-808a-f6458470a201-kube-api-access-hmmm7\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: 
\"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091629 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dc5b308-08ce-4729-b854-d91947b6fce5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-client\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091929 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-dir\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091948 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-metrics-certs\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.091976 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4eceb80-1269-438b-ad35-1a125e8b98c9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.093375 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.093440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfwc\" (UniqueName: \"kubernetes.io/projected/89afffa7-80af-4d36-9f60-c79ad00c737f-kube-api-access-wwfwc\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.093500 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-encryption-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc 
kubenswrapper[4752]: I0227 17:39:03.093525 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5kpc\" (UniqueName: \"kubernetes.io/projected/88f93b49-a038-467e-aebb-eecd7b9f307c-kube-api-access-l5kpc\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.093599 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.093645 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52c3bbf7-f787-4b3f-8028-cdee09aba43e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.094089 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-image-import-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.094460 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phvv\" (UniqueName: \"kubernetes.io/projected/80cf94de-a056-4243-9ade-775eea192f3f-kube-api-access-9phvv\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.094616 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nknrx\" (UniqueName: \"kubernetes.io/projected/127ebcb0-f31e-4857-9a67-842057dd7df4-kube-api-access-nknrx\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.095564 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.095639 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.095816 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l284\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-kube-api-access-9l284\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.097797 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwwb\" (UniqueName: \"kubernetes.io/projected/582c125f-cc05-442d-9bc0-0e588b1dc998-kube-api-access-pnwwb\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098085 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098345 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098368 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098421 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098443 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582c125f-cc05-442d-9bc0-0e588b1dc998-serving-cert\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098485 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/582c125f-cc05-442d-9bc0-0e588b1dc998-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098536 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098610 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098658 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098681 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098768 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.098811 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80cf94de-a056-4243-9ade-775eea192f3f-serving-cert\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099033 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099073 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24vcz\" (UniqueName: \"kubernetes.io/projected/52c3bbf7-f787-4b3f-8028-cdee09aba43e-kube-api-access-24vcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099106 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-config\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfde4c91-4485-402b-aef5-2ffc738ba52d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099168 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/72a3daf3-ca59-4211-9195-1b5c70e4de7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099654 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099769 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-serving-cert\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099796 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-config\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.099816 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f93b49-a038-467e-aebb-eecd7b9f307c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:03 crc kubenswrapper[4752]: 
I0227 17:39:03.099839 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.100434 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-encryption-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-client\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101739 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f93b49-a038-467e-aebb-eecd7b9f307c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101827 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-policies\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-serving-cert\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmpr\" (UniqueName: \"kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.101980 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj7nt\" (UniqueName: \"kubernetes.io/projected/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-kube-api-access-qj7nt\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.102024 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-default-certificate\") pod 
\"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.102289 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.102872 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-audit-policies\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit-dir\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103509 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103553 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-auth-proxy-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103632 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit-dir\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103687 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6k98\" (UniqueName: \"kubernetes.io/projected/72a3daf3-ca59-4211-9195-1b5c70e4de7c-kube-api-access-j6k98\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103787 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/127ebcb0-f31e-4857-9a67-842057dd7df4-metrics-tls\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103851 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89afffa7-80af-4d36-9f60-c79ad00c737f-serving-cert\") pod \"console-operator-58897d9998-lhljv\" (UID: 
\"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103981 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.103992 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.104077 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-node-pullsecrets\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.104107 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.104971 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105096 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105037 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105553 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw7zg\" (UniqueName: \"kubernetes.io/projected/8dc5b308-08ce-4729-b854-d91947b6fce5-kube-api-access-qw7zg\") pod \"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105629 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105656 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-config\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105732 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-client\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105797 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-encryption-config\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.105831 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfde4c91-4485-402b-aef5-2ffc738ba52d-config\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 
17:39:03.105926 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106204 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-client\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106243 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106337 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106377 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106416 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106442 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfde4c91-4485-402b-aef5-2ffc738ba52d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106715 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhhf\" (UniqueName: \"kubernetes.io/projected/621fbcf2-089a-44b7-8130-e4f188d4b03f-kube-api-access-kdhhf\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106865 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.106988 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-stats-auth\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107087 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpd79\" (UniqueName: \"kubernetes.io/projected/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-kube-api-access-gpd79\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107303 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-config\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107332 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-images\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107398 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107452 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.107483 4752 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.109450 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-serving-cert\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.109567 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.109739 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config\") pod \"route-controller-manager-6576b87f9c-x4vxw\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.110047 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-encryption-config\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.110643 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fdb18d1a-6b47-4b81-808a-f6458470a201-etcd-client\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.111730 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.112635 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.113028 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.113397 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.113428 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-config\") pod \"apiserver-76f77b778f-hrbkp\" (UID: 
\"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.117683 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-serving-cert\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.122970 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-audit\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.123282 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.123570 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.123650 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-node-pullsecrets\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.124937 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125207 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125287 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125402 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125480 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125545 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.125680 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.126867 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.127109 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.130672 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.130846 4752 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.131009 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.131383 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.131531 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f93b49-a038-467e-aebb-eecd7b9f307c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.132293 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.132992 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.133696 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.137868 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.138119 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.138160 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.138522 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.138776 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.139408 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.139443 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.139762 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.139886 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.138780 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.140185 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 
27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.141318 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.141792 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.142678 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.143807 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.143802 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.143995 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.145122 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.147772 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.152954 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hrbkp"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.153014 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.153705 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.157312 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.157927 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.178274 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.179055 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.179640 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.180472 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.180596 4752 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.181064 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.182749 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.184415 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.185578 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.186396 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.189205 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.198332 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.200475 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.200798 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.203045 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.204263 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5vcfl"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.204431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.205419 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209058 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-service-ca-bundle\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209179 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/faf810e6-431a-40f6-b5fd-83b74fffc701-machine-approver-tls\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxprw\" (UniqueName: \"kubernetes.io/projected/faf810e6-431a-40f6-b5fd-83b74fffc701-kube-api-access-xxprw\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209232 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-serving-cert\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209321 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-trusted-ca\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.209427 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dc5b308-08ce-4729-b854-d91947b6fce5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210012 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-metrics-certs\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210034 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4eceb80-1269-438b-ad35-1a125e8b98c9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210056 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210092 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210166 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfwc\" (UniqueName: \"kubernetes.io/projected/89afffa7-80af-4d36-9f60-c79ad00c737f-kube-api-access-wwfwc\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210188 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52c3bbf7-f787-4b3f-8028-cdee09aba43e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210210 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9phvv\" (UniqueName: \"kubernetes.io/projected/80cf94de-a056-4243-9ade-775eea192f3f-kube-api-access-9phvv\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210227 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknrx\" (UniqueName: \"kubernetes.io/projected/127ebcb0-f31e-4857-9a67-842057dd7df4-kube-api-access-nknrx\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210274 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l284\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-kube-api-access-9l284\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210301 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210356 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210376 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210395 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnwwb\" (UniqueName: \"kubernetes.io/projected/582c125f-cc05-442d-9bc0-0e588b1dc998-kube-api-access-pnwwb\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210436 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210454 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582c125f-cc05-442d-9bc0-0e588b1dc998-serving-cert\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/582c125f-cc05-442d-9bc0-0e588b1dc998-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210498 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/80cf94de-a056-4243-9ade-775eea192f3f-serving-cert\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210514 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210532 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24vcz\" (UniqueName: \"kubernetes.io/projected/52c3bbf7-f787-4b3f-8028-cdee09aba43e-kube-api-access-24vcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210569 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210586 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-config\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210606 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfde4c91-4485-402b-aef5-2ffc738ba52d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210640 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/72a3daf3-ca59-4211-9195-1b5c70e4de7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210658 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-config\") pod 
\"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmpr\" (UniqueName: \"kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-default-certificate\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210743 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-auth-proxy-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210760 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6k98\" (UniqueName: \"kubernetes.io/projected/72a3daf3-ca59-4211-9195-1b5c70e4de7c-kube-api-access-j6k98\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210781 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/127ebcb0-f31e-4857-9a67-842057dd7df4-metrics-tls\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210798 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89afffa7-80af-4d36-9f60-c79ad00c737f-serving-cert\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210818 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210837 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210855 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw7zg\" (UniqueName: \"kubernetes.io/projected/8dc5b308-08ce-4729-b854-d91947b6fce5-kube-api-access-qw7zg\") pod \"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-config\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.210939 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfde4c91-4485-402b-aef5-2ffc738ba52d-config\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211025 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211057 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-client\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211080 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211099 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211117 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211173 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-stats-auth\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211213 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpd79\" (UniqueName: \"kubernetes.io/projected/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-kube-api-access-gpd79\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211230 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211253 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfde4c91-4485-402b-aef5-2ffc738ba52d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211272 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-config\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-images\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211314 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211333 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211351 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211371 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211391 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211418 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211438 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-service-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211461 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4eceb80-1269-438b-ad35-1a125e8b98c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211484 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c3bbf7-f787-4b3f-8028-cdee09aba43e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 
17:39:03.211508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211526 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vtk\" (UniqueName: \"kubernetes.io/projected/3083b21b-220e-4439-a3c1-18c79f073151-kube-api-access-q6vtk\") pod \"downloads-7954f5f757-cfp2v\" (UID: \"3083b21b-220e-4439-a3c1-18c79f073151\") " pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211544 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh79f\" (UniqueName: \"kubernetes.io/projected/7d737ed6-90b8-4607-bf34-a21992a704e6-kube-api-access-kh79f\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211563 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211581 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjgqk\" (UniqueName: \"kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.211597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2g4j\" (UniqueName: \"kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.212839 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.213246 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4eceb80-1269-438b-ad35-1a125e8b98c9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.213576 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hbfsp"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.213822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dc5b308-08ce-4729-b854-d91947b6fce5-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.214167 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.214701 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.215410 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/582c125f-cc05-442d-9bc0-0e588b1dc998-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.215657 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-config\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.215963 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-trusted-ca\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.215977 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.216236 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-config\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.216496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89afffa7-80af-4d36-9f60-c79ad00c737f-serving-cert\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.216732 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/127ebcb0-f31e-4857-9a67-842057dd7df4-metrics-tls\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.217496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfde4c91-4485-402b-aef5-2ffc738ba52d-config\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.218024 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/72a3daf3-ca59-4211-9195-1b5c70e4de7c-images\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.218054 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/582c125f-cc05-442d-9bc0-0e588b1dc998-serving-cert\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.218996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.219121 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.219727 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80cf94de-a056-4243-9ade-775eea192f3f-serving-cert\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.220456 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.220838 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221048 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536898-598km"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221354 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221442 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lhljv"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221460 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221556 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536898-598km" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221789 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.221043 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.222552 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.222961 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.223204 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52c3bbf7-f787-4b3f-8028-cdee09aba43e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.223891 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.224211 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.224231 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8r7pq"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.226064 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.226115 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.227253 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89afffa7-80af-4d36-9f60-c79ad00c737f-config\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.227724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/faf810e6-431a-40f6-b5fd-83b74fffc701-auth-proxy-config\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.228009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.228044 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-krm5q"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.228063 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dm4md"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.228874 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.229035 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.229103 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.229369 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.229775 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.230431 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4eceb80-1269-438b-ad35-1a125e8b98c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.230444 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.230484 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cf94de-a056-4243-9ade-775eea192f3f-config\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.230499 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.230993 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.231304 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52c3bbf7-f787-4b3f-8028-cdee09aba43e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.231403 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.231600 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.231647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.231996 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.232062 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.232256 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvlkw"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.233223 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.235034 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.235408 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.235425 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xl44c"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.236370 4752 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.237122 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/72a3daf3-ca59-4211-9195-1b5c70e4de7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.237258 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.238101 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/faf810e6-431a-40f6-b5fd-83b74fffc701-machine-approver-tls\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.238401 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfde4c91-4485-402b-aef5-2ffc738ba52d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.238777 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.239177 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.239579 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.240632 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.240821 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.241677 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rxzmm"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.242733 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cfp2v"] Feb 27 17:39:03 crc 
kubenswrapper[4752]: I0227 17:39:03.243792 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.245412 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.246610 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.247793 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tjktn"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.249228 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zj6td"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.250401 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.251522 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.253332 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.253385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-serving-cert\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.255384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.256516 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.256615 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.257854 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.258994 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dm4md"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.260047 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536898-598km"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.261281 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.262995 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.264375 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.265716 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xl44c"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.267383 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-client\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.267672 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.268930 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.270199 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5vcfl"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.271662 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.272814 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.274047 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rjvxc"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.275037 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.275173 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.276535 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.285963 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hbfsp"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.290076 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjvxc"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.290486 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kbhp4"] Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.291480 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.296573 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.299066 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d737ed6-90b8-4607-bf34-a21992a704e6-etcd-service-ca\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.316028 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.335773 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.357181 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.375890 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.397069 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.404244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-default-certificate\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.416937 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.431375 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-stats-auth\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.437096 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.445252 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-metrics-certs\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.457084 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.461282 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-service-ca-bundle\") pod 
\"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.477183 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.496536 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.517236 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.536853 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.557072 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.576414 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.599058 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.617068 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.637526 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.657059 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.676945 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.698011 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.717655 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.737497 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.757458 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.781346 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.797483 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 27 17:39:03 crc 
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.817304 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.837316 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.856896 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.877189 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.897026 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.917248 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.937710 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.957821 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 27 17:39:03 crc kubenswrapper[4752]: I0227 17:39:03.979685 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.007896 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.016295 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.064999 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5kpc\" (UniqueName: \"kubernetes.io/projected/88f93b49-a038-467e-aebb-eecd7b9f307c-kube-api-access-l5kpc\") pod \"openshift-apiserver-operator-796bbdcf4f-q42ms\" (UID: \"88f93b49-a038-467e-aebb-eecd7b9f307c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.085333 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmm7\" (UniqueName: \"kubernetes.io/projected/fdb18d1a-6b47-4b81-808a-f6458470a201-kube-api-access-hmmm7\") pod \"apiserver-7bbb656c7d-ccqh7\" (UID: \"fdb18d1a-6b47-4b81-808a-f6458470a201\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.097484 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.107798 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj7nt\" (UniqueName: \"kubernetes.io/projected/0480be6e-a859-4bd7-8aad-0a7e5bf06a0e-kube-api-access-qj7nt\") pod \"apiserver-76f77b778f-hrbkp\" (UID: \"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e\") " pod="openshift-apiserver/apiserver-76f77b778f-hrbkp"
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.115323 4752 request.go:700] Waited for 1.009825293s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0
Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.117751 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
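[annotation] request.go:700 "Waited for 1.009825293s due to client-side throttling, not priority and fairness" is client-go telling on itself: the delay comes from the client's own token-bucket rate limiter, not from API Priority and Fairness on the server. With one GET per referenced Secret/ConfigMap during startup, requests queue behind the bucket for seconds. A sketch of where that limiter lives, with illustrative (not recommended) values:

```go
package main

import (
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/util/flowcontrol"
)

// Illustrative only: how client-go's client-side limiter is configured.
func newClient(cfg *rest.Config) (*kubernetes.Clientset, error) {
	// client-go defaults are QPS=5, Burst=10; a flood of per-object GETs
	// like the ones in this log backs up behind the token bucket.
	cfg.QPS = 50   // hypothetical values, for illustration
	cfg.Burst = 100
	// Equivalent explicit form of the same limiter:
	cfg.RateLimiter = flowcontrol.NewTokenBucketRateLimiter(cfg.QPS, cfg.Burst)
	return kubernetes.NewForConfig(cfg)
}
```

The kubelet exposes the same knobs through the kubeAPIQPS/kubeAPIBurst fields of its KubeletConfiguration.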
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.238912 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.257761 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.280481 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.298776 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.317083 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.354451 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.360549 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.378186 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.397589 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.417369 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.436866 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.457238 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.477259 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.477588 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.490331 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.504750 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 27 17:39:04 crc kubenswrapper[4752]: W0227 17:39:04.505550 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdb18d1a_6b47_4b81_808a_f6458470a201.slice/crio-20833522e3afa5736a7b0192d50645a472740c5f5b66291d7f292119222c9654 WatchSource:0}: Error finding container 20833522e3afa5736a7b0192d50645a472740c5f5b66291d7f292119222c9654: Status 404 returned error can't find the container with id 20833522e3afa5736a7b0192d50645a472740c5f5b66291d7f292119222c9654 Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.516337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.516889 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.537459 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.542418 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hrbkp"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.556631 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.577619 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.597425 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.619228 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.658474 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxprw\" (UniqueName: \"kubernetes.io/projected/faf810e6-431a-40f6-b5fd-83b74fffc701-kube-api-access-xxprw\") pod \"machine-approver-56656f9798-g96w7\" (UID: \"faf810e6-431a-40f6-b5fd-83b74fffc701\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.675254 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2g4j\" (UniqueName: \"kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j\") pod \"console-f9d7485db-zj6td\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") " pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.683138 4752 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.694429 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.696416 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.716562 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.731457 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfde4c91-4485-402b-aef5-2ffc738ba52d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lfrdd\" (UID: \"bfde4c91-4485-402b-aef5-2ffc738ba52d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.736741 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.759102 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.774828 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.777686 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.788566 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.796988 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.818568 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.852552 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh79f\" (UniqueName: \"kubernetes.io/projected/7d737ed6-90b8-4607-bf34-a21992a704e6-kube-api-access-kh79f\") pod \"etcd-operator-b45778765-rxzmm\" (UID: \"7d737ed6-90b8-4607-bf34-a21992a704e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.860501 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.892212 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vtk\" (UniqueName: \"kubernetes.io/projected/3083b21b-220e-4439-a3c1-18c79f073151-kube-api-access-q6vtk\") pod \"downloads-7954f5f757-cfp2v\" (UID: \"3083b21b-220e-4439-a3c1-18c79f073151\") " pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.896517 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.933606 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwwb\" (UniqueName: \"kubernetes.io/projected/582c125f-cc05-442d-9bc0-0e588b1dc998-kube-api-access-pnwwb\") pod \"openshift-config-operator-7777fb866f-fb9f6\" (UID: \"582c125f-cc05-442d-9bc0-0e588b1dc998\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.951102 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfwc\" (UniqueName: \"kubernetes.io/projected/89afffa7-80af-4d36-9f60-c79ad00c737f-kube-api-access-wwfwc\") pod \"console-operator-58897d9998-lhljv\" (UID: \"89afffa7-80af-4d36-9f60-c79ad00c737f\") " pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.957004 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zj6td"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.973552 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24vcz\" (UniqueName: \"kubernetes.io/projected/52c3bbf7-f787-4b3f-8028-cdee09aba43e-kube-api-access-24vcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-mnfsk\" (UID: \"52c3bbf7-f787-4b3f-8028-cdee09aba43e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.993434 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd"] Feb 27 17:39:04 crc kubenswrapper[4752]: I0227 17:39:04.999584 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9phvv\" (UniqueName: \"kubernetes.io/projected/80cf94de-a056-4243-9ade-775eea192f3f-kube-api-access-9phvv\") pod \"authentication-operator-69f744f599-dvlkw\" (UID: \"80cf94de-a056-4243-9ade-775eea192f3f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.004139 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.016320 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknrx\" (UniqueName: \"kubernetes.io/projected/127ebcb0-f31e-4857-9a67-842057dd7df4-kube-api-access-nknrx\") pod \"dns-operator-744455d44c-krm5q\" (UID: \"127ebcb0-f31e-4857-9a67-842057dd7df4\") " pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.031889 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw7zg\" (UniqueName: \"kubernetes.io/projected/8dc5b308-08ce-4729-b854-d91947b6fce5-kube-api-access-qw7zg\") pod \"cluster-samples-operator-665b6dd947-x5tb9\" (UID: \"8dc5b308-08ce-4729-b854-d91947b6fce5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.051784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.060728 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.067431 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.073198 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l284\" (UniqueName: \"kubernetes.io/projected/a4eceb80-1269-438b-ad35-1a125e8b98c9-kube-api-access-9l284\") pod \"cluster-image-registry-operator-dc59b4c8b-4r7b2\" (UID: \"a4eceb80-1269-438b-ad35-1a125e8b98c9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.084102 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.091341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6k98\" (UniqueName: \"kubernetes.io/projected/72a3daf3-ca59-4211-9195-1b5c70e4de7c-kube-api-access-j6k98\") pod \"machine-api-operator-5694c8668f-tjktn\" (UID: \"72a3daf3-ca59-4211-9195-1b5c70e4de7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.096847 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.104436 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.129208 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjgqk\" (UniqueName: \"kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk\") pod \"controller-manager-879f6c89f-tbqcp\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.134260 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.134670 4752 request.go:700] Waited for 1.904575826s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.145259 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.145515 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.151421 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.151984 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmpr\" (UniqueName: \"kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr\") pod \"oauth-openshift-558db77b4-8r7pq\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.157215 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.164443 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.181139 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.198022 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.201344 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.237163 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpd79\" (UniqueName: \"kubernetes.io/projected/f3b9cef1-7930-44bf-9bc7-5e28f8282e4e-kube-api-access-gpd79\") pod \"router-default-5444994796-jsq6c\" (UID: \"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e\") " pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.238667 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.244088 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.255511 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.257494 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.277561 4752 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.296935 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.320889 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.335983 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.360073 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.377222 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.388844 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-krm5q"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.389523 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.396458 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.412594 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:05 crc kubenswrapper[4752]: W0227 17:39:05.430397 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c3bbf7_f787_4b3f_8028_cdee09aba43e.slice/crio-0a137b0c23a57c16adfc74178db5cc831b5d449e2d62b8a353c72f0fbd66517b WatchSource:0}: Error finding container 0a137b0c23a57c16adfc74178db5cc831b5d449e2d62b8a353c72f0fbd66517b: Status 404 returned error can't find the container with id 0a137b0c23a57c16adfc74178db5cc831b5d449e2d62b8a353c72f0fbd66517b Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449008 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4zr\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07dbba0-06fa-4b50-9ce7-76f943a9a355-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449131 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449188 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfv7k\" (UniqueName: \"kubernetes.io/projected/ae0e3048-c296-4078-a2da-4f630f3e01bc-kube-api-access-lfv7k\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-srv-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449242 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449278 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls\") pod 
\"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449294 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0bcb3ab-f03a-410d-911e-baffe86632c2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449311 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-kube-api-access-79g5p\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449326 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0bcb3ab-f03a-410d-911e-baffe86632c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449351 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449450 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmg2\" (UniqueName: \"kubernetes.io/projected/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-kube-api-access-djmg2\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449586 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9d110c-49da-44fc-b366-308548c190ad-proxy-tls\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449667 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449688 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449705 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449733 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449753 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xjd\" (UniqueName: \"kubernetes.io/projected/f0bcb3ab-f03a-410d-911e-baffe86632c2-kube-api-access-m9xjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449822 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b07dbba0-06fa-4b50-9ce7-76f943a9a355-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.449977 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450055 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450136 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-srv-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450181 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2b7f\" (UniqueName: \"kubernetes.io/projected/bf9d110c-49da-44fc-b366-308548c190ad-kube-api-access-f2b7f\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450226 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b07dbba0-06fa-4b50-9ce7-76f943a9a355-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450298 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450704 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450921 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.450992 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.451020 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.451088 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qpgd\" (UniqueName: \"kubernetes.io/projected/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-kube-api-access-6qpgd\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.451304 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:05.951287735 +0000 UTC m=+245.858104596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.451121 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf9d110c-49da-44fc-b366-308548c190ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.459490 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" event={"ID":"52c3bbf7-f787-4b3f-8028-cdee09aba43e","Type":"ContainerStarted","Data":"0a137b0c23a57c16adfc74178db5cc831b5d449e2d62b8a353c72f0fbd66517b"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.470849 4752 generic.go:334] "Generic (PLEG): container finished" podID="fdb18d1a-6b47-4b81-808a-f6458470a201" containerID="8158fa50bce11bbc391e6371af0241091b08666176efd36e19de616e16b428fb" exitCode=0 Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.470934 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" event={"ID":"fdb18d1a-6b47-4b81-808a-f6458470a201","Type":"ContainerDied","Data":"8158fa50bce11bbc391e6371af0241091b08666176efd36e19de616e16b428fb"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.470964 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" event={"ID":"fdb18d1a-6b47-4b81-808a-f6458470a201","Type":"ContainerStarted","Data":"20833522e3afa5736a7b0192d50645a472740c5f5b66291d7f292119222c9654"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.473637 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" event={"ID":"bfde4c91-4485-402b-aef5-2ffc738ba52d","Type":"ContainerStarted","Data":"1b75ebda41999d419c6f33ac56155668c29095daf00ed12f8964d7ff0047407b"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.473739 4752 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" event={"ID":"bfde4c91-4485-402b-aef5-2ffc738ba52d","Type":"ContainerStarted","Data":"b448be7a2c9fcb9457842dd9780c522f3d6fa9f0c0e5e1406bd91fb8c7c451a3"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.476943 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" event={"ID":"88f93b49-a038-467e-aebb-eecd7b9f307c","Type":"ContainerStarted","Data":"4f24e253d05cb453222a4949d0eb5aae01b5613adab93fa41147b0f6d38975bc"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.476992 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" event={"ID":"88f93b49-a038-467e-aebb-eecd7b9f307c","Type":"ContainerStarted","Data":"17515e33fc7460592ff57a80604ac8a9cb17b211ba2fcbcd9defa83a5514f0ff"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.483303 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zj6td" event={"ID":"2d1dabd3-4307-468d-86d9-01a1ac2e3539","Type":"ContainerStarted","Data":"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.483353 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zj6td" event={"ID":"2d1dabd3-4307-468d-86d9-01a1ac2e3539","Type":"ContainerStarted","Data":"43f8e47388658cac77045d5b07b88f1a9a72db63db112ac5d99816c4ead09662"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.490604 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" event={"ID":"faf810e6-431a-40f6-b5fd-83b74fffc701","Type":"ContainerStarted","Data":"8e1032d1a82074fa5fe552d80a2f00b8a97be412268d7f9ce65ec7cd682f2d12"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.490667 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" event={"ID":"faf810e6-431a-40f6-b5fd-83b74fffc701","Type":"ContainerStarted","Data":"d6c465764b1a15bbe42f66c9f1791048b6b9ce43abf28627d85ee587afc6f324"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.490679 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" event={"ID":"faf810e6-431a-40f6-b5fd-83b74fffc701","Type":"ContainerStarted","Data":"8adf08c4aa2c836161411d3e84efb9af8308e2f3ca11af86a8e8ab1299fa86ba"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.496267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" event={"ID":"621fbcf2-089a-44b7-8130-e4f188d4b03f","Type":"ContainerStarted","Data":"fe8745c17a10cae50196d9cd0f9a1b4a3447c8425188f03ffe0b51de6333fe83"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.496326 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.496337 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" 
event={"ID":"621fbcf2-089a-44b7-8130-e4f188d4b03f","Type":"ContainerStarted","Data":"30876044b986f7c558a97b36834e4a9f1187f434ec8f3e1c3bf35165dce044bd"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.514030 4752 generic.go:334] "Generic (PLEG): container finished" podID="0480be6e-a859-4bd7-8aad-0a7e5bf06a0e" containerID="afe4a1701fd8ff7a4d1f5b2eceaf2d5c9d0fe46be33439dc2181310bae224aaf" exitCode=0 Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.514388 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" event={"ID":"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e","Type":"ContainerDied","Data":"afe4a1701fd8ff7a4d1f5b2eceaf2d5c9d0fe46be33439dc2181310bae224aaf"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.514461 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" event={"ID":"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e","Type":"ContainerStarted","Data":"94497a2a81903a33c0bc3e11231bf697a3bc508b471709f8f97ae8d89a7563cd"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.523803 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" event={"ID":"582c125f-cc05-442d-9bc0-0e588b1dc998","Type":"ContainerStarted","Data":"40a7cf612913cb3ce7949e61a6e1f80d2d430ddfc4c3dccc2baeadcc8d718ab8"} Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.556111 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-registration-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnvt\" (UniqueName: \"kubernetes.io/projected/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-kube-api-access-7lnvt\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557485 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4wm\" (UniqueName: \"kubernetes.io/projected/8a2327c4-0233-4973-927c-5b434e75ece4-kube-api-access-6g4wm\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: 
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.556111 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557378 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-registration-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnvt\" (UniqueName: \"kubernetes.io/projected/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-kube-api-access-7lnvt\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557485 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557571 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4wm\" (UniqueName: \"kubernetes.io/projected/8a2327c4-0233-4973-927c-5b434e75ece4-kube-api-access-6g4wm\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557653 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgllq\" (UniqueName: \"kubernetes.io/projected/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-kube-api-access-kgllq\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557709 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557739 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-serving-cert\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557781 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-socket-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557803 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557832 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-webhook-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557954 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-csi-data-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.557983 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558043 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558101 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qpgd\" (UniqueName: \"kubernetes.io/projected/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-kube-api-access-6qpgd\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558132 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d54g\" (UniqueName: \"kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558196 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf9d110c-49da-44fc-b366-308548c190ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558254 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4zr\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558284 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07dbba0-06fa-4b50-9ce7-76f943a9a355-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558308 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-key\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558355 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9djm\" (UniqueName: \"kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm\") pod \"auto-csr-approver-29536898-598km\" (UID: \"cc36acda-9447-479d-b741-c063ecb91f3e\") " pod="openshift-infra/auto-csr-approver-29536898-598km"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558385 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zz78\" (UniqueName: \"kubernetes.io/projected/506337c9-3cdd-451f-a015-6d3e25d43c22-kube-api-access-7zz78\") pod \"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558460 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558494 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjbk\" (UniqueName: \"kubernetes.io/projected/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-kube-api-access-nvjbk\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfv7k\" (UniqueName: \"kubernetes.io/projected/ae0e3048-c296-4078-a2da-4f630f3e01bc-kube-api-access-lfv7k\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558568 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-srv-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558597 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-node-bootstrap-token\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558692 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/f0bcb3ab-f03a-410d-911e-baffe86632c2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558715 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558756 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558900 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-kube-api-access-79g5p\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558933 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0bcb3ab-f03a-410d-911e-baffe86632c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.558991 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.559029 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d1444448-daeb-4e89-8b0c-2c97127b00c2-tmpfs\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.559056 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-cabundle\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.559102 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.559398 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561077 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561166 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl84j\" (UniqueName: \"kubernetes.io/projected/d1444448-daeb-4e89-8b0c-2c97127b00c2-kube-api-access-fl84j\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561242 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djmg2\" (UniqueName: \"kubernetes.io/projected/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-kube-api-access-djmg2\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561328 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-mountpoint-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561407 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9d110c-49da-44fc-b366-308548c190ad-proxy-tls\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561492 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-certs\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561535 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-proxy-tls\") pod 
\"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561645 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561698 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561738 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561787 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-plugins-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561822 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj9ts\" (UniqueName: \"kubernetes.io/projected/e372223f-91ea-40f7-93f8-38bb0a08c646-kube-api-access-dj9ts\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561852 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-config\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561892 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561924 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xjd\" (UniqueName: \"kubernetes.io/projected/f0bcb3ab-f03a-410d-911e-baffe86632c2-kube-api-access-m9xjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.561934 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bf9d110c-49da-44fc-b366-308548c190ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.564687 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d592b\" (UniqueName: \"kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.567183 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.572412 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.575502 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b07dbba0-06fa-4b50-9ce7-76f943a9a355-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.575844 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.575960 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b07dbba0-06fa-4b50-9ce7-76f943a9a355-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576022 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/506337c9-3cdd-451f-a015-6d3e25d43c22-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576063 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjmz\" (UniqueName: \"kubernetes.io/projected/b86e38ee-36f1-4bf4-86c0-3b13b1d95103-kube-api-access-lmjmz\") pod \"migrator-59844c95c7-s8s94\" (UID: \"b86e38ee-36f1-4bf4-86c0-3b13b1d95103\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576090 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqkl\" (UniqueName: \"kubernetes.io/projected/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-kube-api-access-rmqkl\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576123 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576170 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlc7\" (UniqueName: \"kubernetes.io/projected/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-kube-api-access-cjlc7\") pod \"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576201 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-images\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576248 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e372223f-91ea-40f7-93f8-38bb0a08c646-metrics-tls\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576273 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-cert\") pod \"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.576302 4752 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk8rv\" (UniqueName: \"kubernetes.io/projected/455fc602-83ab-4544-bfa2-01e0b35bc8dc-kube-api-access-jk8rv\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.581135 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.581980 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e372223f-91ea-40f7-93f8-38bb0a08c646-config-volume\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.582280 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.582419 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-srv-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.582543 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2b7f\" (UniqueName: \"kubernetes.io/projected/bf9d110c-49da-44fc-b366-308548c190ad-kube-api-access-f2b7f\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.582624 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-config\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.582859 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.082829153 +0000 UTC m=+245.989646004 (durationBeforeRetry 500ms). 
Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.582859 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.082829153 +0000 UTC m=+245.989646004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.584124 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0bcb3ab-f03a-410d-911e-baffe86632c2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.589651 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.590341 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9d110c-49da-44fc-b366-308548c190ad-proxy-tls\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.593782 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0bcb3ab-f03a-410d-911e-baffe86632c2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.594366 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-srv-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.595094 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.595129 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b07dbba0-06fa-4b50-9ce7-76f943a9a355-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc 
kubenswrapper[4752]: I0227 17:39:05.595294 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.595438 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.595936 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.095906576 +0000 UTC m=+246.002723627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.605828 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-srv-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.614802 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qpgd\" (UniqueName: \"kubernetes.io/projected/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-kube-api-access-6qpgd\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.615403 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.615767 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.616496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b07dbba0-06fa-4b50-9ce7-76f943a9a355-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.632279 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae0e3048-c296-4078-a2da-4f630f3e01bc-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.632515 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.633165 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/020f54a0-e34f-46bf-8ec0-56da0ea1a8f8-profile-collector-cert\") pod \"catalog-operator-68c6474976-m24gb\" (UID: \"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.637724 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4zr\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.657712 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.670246 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.690256 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tjktn"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.691791 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698075 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9xjd\" (UniqueName: \"kubernetes.io/projected/f0bcb3ab-f03a-410d-911e-baffe86632c2-kube-api-access-m9xjd\") pod \"kube-storage-version-migrator-operator-b67b599dd-lcgcn\" (UID: \"f0bcb3ab-f03a-410d-911e-baffe86632c2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.698120 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.198089999 +0000 UTC m=+246.104906840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698296 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698563 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-proxy-tls\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698930 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-config\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698955 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-plugins-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.698991 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj9ts\" (UniqueName: \"kubernetes.io/projected/e372223f-91ea-40f7-93f8-38bb0a08c646-kube-api-access-dj9ts\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.699014 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d592b\" (UniqueName: \"kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.699032 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700233 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/506337c9-3cdd-451f-a015-6d3e25d43c22-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700267 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjmz\" (UniqueName: \"kubernetes.io/projected/b86e38ee-36f1-4bf4-86c0-3b13b1d95103-kube-api-access-lmjmz\") pod \"migrator-59844c95c7-s8s94\" (UID: \"b86e38ee-36f1-4bf4-86c0-3b13b1d95103\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700302 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqkl\" (UniqueName: \"kubernetes.io/projected/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-kube-api-access-rmqkl\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700324 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlc7\" (UniqueName: \"kubernetes.io/projected/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-kube-api-access-cjlc7\") pod \"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700363 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-images\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700381 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e372223f-91ea-40f7-93f8-38bb0a08c646-metrics-tls\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700403 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk8rv\" (UniqueName: \"kubernetes.io/projected/455fc602-83ab-4544-bfa2-01e0b35bc8dc-kube-api-access-jk8rv\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700421 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-cert\") pod \"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:05 crc 
kubenswrapper[4752]: I0227 17:39:05.700444 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e372223f-91ea-40f7-93f8-38bb0a08c646-config-volume\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700516 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700558 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-registration-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700576 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700597 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnvt\" (UniqueName: \"kubernetes.io/projected/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-kube-api-access-7lnvt\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700637 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4wm\" (UniqueName: \"kubernetes.io/projected/8a2327c4-0233-4973-927c-5b434e75ece4-kube-api-access-6g4wm\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700679 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgllq\" (UniqueName: \"kubernetes.io/projected/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-kube-api-access-kgllq\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-serving-cert\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700732 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-socket-dir\") pod 
\"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700755 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700780 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-webhook-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700801 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700824 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-csi-data-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700876 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d54g\" (UniqueName: \"kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700893 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-config\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.700914 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-key\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.702194 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-socket-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.702330 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-images\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.703805 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.703915 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-csi-data-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.704074 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-registration-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.704361 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-plugins-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.704750 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.204739953 +0000 UTC m=+246.111556804 (durationBeforeRetry 500ms). 
Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.704750 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.204739953 +0000 UTC m=+246.111556804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.705546 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.705852 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e372223f-91ea-40f7-93f8-38bb0a08c646-config-volume\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.705959 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9djm\" (UniqueName: \"kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm\") pod \"auto-csr-approver-29536898-598km\" (UID: \"cc36acda-9447-479d-b741-c063ecb91f3e\") " pod="openshift-infra/auto-csr-approver-29536898-598km"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.705981 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zz78\" (UniqueName: \"kubernetes.io/projected/506337c9-3cdd-451f-a015-6d3e25d43c22-kube-api-access-7zz78\") pod \"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706005 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjbk\" (UniqueName: \"kubernetes.io/projected/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-kube-api-access-nvjbk\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706046 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-node-bootstrap-token\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706073 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"
Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706098 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/d1444448-daeb-4e89-8b0c-2c97127b00c2-tmpfs\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706114 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-cabundle\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706190 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706214 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl84j\" (UniqueName: \"kubernetes.io/projected/d1444448-daeb-4e89-8b0c-2c97127b00c2-kube-api-access-fl84j\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706268 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-mountpoint-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706297 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-certs\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.706952 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e372223f-91ea-40f7-93f8-38bb0a08c646-metrics-tls\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.709926 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cfp2v"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.710063 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/506337c9-3cdd-451f-a015-6d3e25d43c22-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.710247 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d1444448-daeb-4e89-8b0c-2c97127b00c2-tmpfs\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.711161 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.711597 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.711603 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-cabundle\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.712164 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.712592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.712731 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-mountpoint-dir\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.714365 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-certs\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.714461 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/455fc602-83ab-4544-bfa2-01e0b35bc8dc-signing-key\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.714556 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-serving-cert\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.714783 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1444448-daeb-4e89-8b0c-2c97127b00c2-webhook-cert\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.715396 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8a2327c4-0233-4973-927c-5b434e75ece4-node-bootstrap-token\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.717684 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.717852 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-cert\") pod \"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.719480 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-proxy-tls\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.726874 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.728216 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfv7k\" (UniqueName: \"kubernetes.io/projected/ae0e3048-c296-4078-a2da-4f630f3e01bc-kube-api-access-lfv7k\") pod \"olm-operator-6b444d44fb-9s4n4\" (UID: \"ae0e3048-c296-4078-a2da-4f630f3e01bc\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.732569 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b07dbba0-06fa-4b50-9ce7-76f943a9a355-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qxgv7\" (UID: \"b07dbba0-06fa-4b50-9ce7-76f943a9a355\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.751547 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79g5p\" (UniqueName: \"kubernetes.io/projected/7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa-kube-api-access-79g5p\") pod \"ingress-operator-5b745b69d9-jqd62\" (UID: \"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.758299 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" Feb 27 17:39:05 crc kubenswrapper[4752]: W0227 17:39:05.760720 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4eceb80_1269_438b_ad35_1a125e8b98c9.slice/crio-f58dc7a06e5a7d3edddaa90465cd22e4024abbd331401f4132095d3573fa298d WatchSource:0}: Error finding container f58dc7a06e5a7d3edddaa90465cd22e4024abbd331401f4132095d3573fa298d: Status 404 returned error can't find the container with id f58dc7a06e5a7d3edddaa90465cd22e4024abbd331401f4132095d3573fa298d Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.787946 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lhljv"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.792438 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2b7f\" (UniqueName: \"kubernetes.io/projected/bf9d110c-49da-44fc-b366-308548c190ad-kube-api-access-f2b7f\") pod \"machine-config-controller-84d6567774-4psjf\" (UID: \"bf9d110c-49da-44fc-b366-308548c190ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.797857 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvlkw"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.808067 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.808956 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.308939886 +0000 UTC m=+246.215756727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.810000 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3047a3c-0344-4f73-a4d1-5f2c278fa1b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lps47\" (UID: \"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.818783 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmg2\" (UniqueName: \"kubernetes.io/projected/5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e-kube-api-access-djmg2\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pwkv\" (UID: \"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.839435 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" Feb 27 17:39:05 crc kubenswrapper[4752]: W0227 17:39:05.839840 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89afffa7_80af_4d36_9f60_c79ad00c737f.slice/crio-0064b3384a4832d6650dfa635ca2b0a382a792a13223420db2dc4bc51541916f WatchSource:0}: Error finding container 0064b3384a4832d6650dfa635ca2b0a382a792a13223420db2dc4bc51541916f: Status 404 returned error can't find the container with id 0064b3384a4832d6650dfa635ca2b0a382a792a13223420db2dc4bc51541916f Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.845189 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:05 crc kubenswrapper[4752]: W0227 17:39:05.850247 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80cf94de_a056_4243_9ade_775eea192f3f.slice/crio-8df857a560e41da0efc68992bc09b6c340632ea2765eb32116c6bd39849dcabf WatchSource:0}: Error finding container 8df857a560e41da0efc68992bc09b6c340632ea2765eb32116c6bd39849dcabf: Status 404 returned error can't find the container with id 8df857a560e41da0efc68992bc09b6c340632ea2765eb32116c6bd39849dcabf Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.851954 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjmz\" (UniqueName: \"kubernetes.io/projected/b86e38ee-36f1-4bf4-86c0-3b13b1d95103-kube-api-access-lmjmz\") pod \"migrator-59844c95c7-s8s94\" (UID: \"b86e38ee-36f1-4bf4-86c0-3b13b1d95103\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.879357 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqkl\" (UniqueName: \"kubernetes.io/projected/9e8dcbe6-373e-4c76-93fa-d30b75dd50db-kube-api-access-rmqkl\") pod \"package-server-manager-789f6589d5-dmrgx\" (UID: \"9e8dcbe6-373e-4c76-93fa-d30b75dd50db\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.914763 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:05 crc kubenswrapper[4752]: E0227 17:39:05.915646 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.415629411 +0000 UTC m=+246.322446262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.919851 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.919925 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rxzmm"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.931366 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.936757 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8r7pq"] Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.954266 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgllq\" (UniqueName: \"kubernetes.io/projected/e61b5d7e-db3b-49d0-94de-95a19c8fc89b-kube-api-access-kgllq\") pod \"service-ca-operator-777779d784-d5rp7\" (UID: \"e61b5d7e-db3b-49d0-94de-95a19c8fc89b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.983576 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4wm\" (UniqueName: \"kubernetes.io/projected/8a2327c4-0233-4973-927c-5b434e75ece4-kube-api-access-6g4wm\") pod \"machine-config-server-kbhp4\" (UID: \"8a2327c4-0233-4973-927c-5b434e75ece4\") " pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.991822 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d592b\" (UniqueName: \"kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b\") pod \"marketplace-operator-79b997595-fcc4l\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.998511 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj9ts\" (UniqueName: \"kubernetes.io/projected/e372223f-91ea-40f7-93f8-38bb0a08c646-kube-api-access-dj9ts\") pod \"dns-default-rjvxc\" (UID: \"e372223f-91ea-40f7-93f8-38bb0a08c646\") " pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.998520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnvt\" (UniqueName: \"kubernetes.io/projected/081ebfd1-71a9-470a-8f73-9a673f6bcb9b-kube-api-access-7lnvt\") pod \"csi-hostpathplugin-xl44c\" (UID: \"081ebfd1-71a9-470a-8f73-9a673f6bcb9b\") " pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:05 crc kubenswrapper[4752]: I0227 17:39:05.998756 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjlc7\" (UniqueName: \"kubernetes.io/projected/b3cad9bd-0350-4262-86a0-cdb0f3a776ad-kube-api-access-cjlc7\") pod 
\"ingress-canary-dm4md\" (UID: \"b3cad9bd-0350-4262-86a0-cdb0f3a776ad\") " pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.002169 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.016065 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.016325 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.516306386 +0000 UTC m=+246.423123227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.016447 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.016749 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.516739527 +0000 UTC m=+246.423556378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.017131 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk8rv\" (UniqueName: \"kubernetes.io/projected/455fc602-83ab-4544-bfa2-01e0b35bc8dc-kube-api-access-jk8rv\") pod \"service-ca-9c57cc56f-5vcfl\" (UID: \"455fc602-83ab-4544-bfa2-01e0b35bc8dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.032382 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.039845 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.041271 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d54g\" (UniqueName: \"kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g\") pod \"collect-profiles-29536890-gjpc2\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.046479 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.049570 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.054133 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9djm\" (UniqueName: \"kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm\") pod \"auto-csr-approver-29536898-598km\" (UID: \"cc36acda-9447-479d-b741-c063ecb91f3e\") " pod="openshift-infra/auto-csr-approver-29536898-598km" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.066051 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.074720 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:06 crc kubenswrapper[4752]: W0227 17:39:06.080472 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d737ed6_90b8_4607_bf34_a21992a704e6.slice/crio-44aa57aaff8d2d8848424b72e67248a1b20d4242a372e7255ce375b9b7e089de WatchSource:0}: Error finding container 44aa57aaff8d2d8848424b72e67248a1b20d4242a372e7255ce375b9b7e089de: Status 404 returned error can't find the container with id 44aa57aaff8d2d8848424b72e67248a1b20d4242a372e7255ce375b9b7e089de Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.082565 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" Feb 27 17:39:06 crc kubenswrapper[4752]: W0227 17:39:06.082944 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4c8c40_7b60_44c4_a1f4_e7c2ed84035b.slice/crio-ea61c46b1177ef7cedb466875f13f81d943ace62d56710820e64c26e67497df3 WatchSource:0}: Error finding container ea61c46b1177ef7cedb466875f13f81d943ace62d56710820e64c26e67497df3: Status 404 returned error can't find the container with id ea61c46b1177ef7cedb466875f13f81d943ace62d56710820e64c26e67497df3 Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.091682 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.092304 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zz78\" (UniqueName: \"kubernetes.io/projected/506337c9-3cdd-451f-a015-6d3e25d43c22-kube-api-access-7zz78\") pod \"multus-admission-controller-857f4d67dd-hbfsp\" (UID: \"506337c9-3cdd-451f-a015-6d3e25d43c22\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.094458 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjbk\" (UniqueName: \"kubernetes.io/projected/856bbe2c-3e10-42fb-ac3c-470e0057bbaf-kube-api-access-nvjbk\") pod \"machine-config-operator-74547568cd-vz9rd\" (UID: \"856bbe2c-3e10-42fb-ac3c-470e0057bbaf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.106687 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.119303 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.119941 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.619919235 +0000 UTC m=+246.526736096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.120049 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.121547 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl84j\" (UniqueName: \"kubernetes.io/projected/d1444448-daeb-4e89-8b0c-2c97127b00c2-kube-api-access-fl84j\") pod \"packageserver-d55dfcdfc-4vgtn\" (UID: \"d1444448-daeb-4e89-8b0c-2c97127b00c2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.125660 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.127888 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.132714 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.156717 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536898-598km" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.161969 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.169886 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dm4md" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.178765 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.202439 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.207700 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kbhp4" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.221319 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.221906 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.721872462 +0000 UTC m=+246.628689333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.239549 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4"] Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.323513 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.823481651 +0000 UTC m=+246.730298502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.323939 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.324017 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.322951 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.328596 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.332076 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.832047163 +0000 UTC m=+246.738864014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.389656 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.409429 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.433199 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.433420 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.933384835 +0000 UTC m=+246.840201686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.433648 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.434069 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:06.934043431 +0000 UTC m=+246.840860282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.450027 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.450084 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.541050 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.556245 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.056205288 +0000 UTC m=+246.963022139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.574560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" event={"ID":"fdb18d1a-6b47-4b81-808a-f6458470a201","Type":"ContainerStarted","Data":"9d4cd462ebb3d2cf4c7161f0f4a81396323b9dc3ced909a4f2f57ad726abba59"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.597908 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" event={"ID":"72a3daf3-ca59-4211-9195-1b5c70e4de7c","Type":"ContainerStarted","Data":"f81e8ce8a9927a0ab2e00458aa69f0cb456c4bb8fccc997143bb41d5ff6e261f"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.597980 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" event={"ID":"72a3daf3-ca59-4211-9195-1b5c70e4de7c","Type":"ContainerStarted","Data":"1d63829d9c58160a180a15b79eacfc73ef2988c1ec29e8adcd6a57d2f80b5b2f"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.610250 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62"] Feb 27 17:39:06 crc kubenswrapper[4752]: W0227 17:39:06.619381 4752 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb07dbba0_06fa_4b50_9ce7_76f943a9a355.slice/crio-289e64bd5e00edbceb8dd173b98c2547e91bde8a17bbf43fb8774470f801a0a6 WatchSource:0}: Error finding container 289e64bd5e00edbceb8dd173b98c2547e91bde8a17bbf43fb8774470f801a0a6: Status 404 returned error can't find the container with id 289e64bd5e00edbceb8dd173b98c2547e91bde8a17bbf43fb8774470f801a0a6 Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.628739 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" event={"ID":"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e","Type":"ContainerStarted","Data":"7feef7a0c2f22c1b8bf0434fdd42c487797aa324c4a15962502319c1c9b9b455"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.642737 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.643497 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.143482383 +0000 UTC m=+247.050299234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.656777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" event={"ID":"52c3bbf7-f787-4b3f-8028-cdee09aba43e","Type":"ContainerStarted","Data":"3fd58ffa348446220d909414e45f17788b9d257fe72c192fe0ccb6951aa3f850"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.687567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" event={"ID":"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8","Type":"ContainerStarted","Data":"8e3d73210fb9185a70be80b12d571e39dc093f2be75f19f99de698ef61494358"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.687903 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.735867 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hbfsp"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.746970 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.752590 4752 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.252559876 +0000 UTC m=+247.159376727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.804363 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv"] Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.815302 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" event={"ID":"127ebcb0-f31e-4857-9a67-842057dd7df4","Type":"ContainerStarted","Data":"3b5398ac0a831149c119af7684a0730aa4848d70cb55a1ca7df381d6ebe9b1b6"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.815348 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" event={"ID":"127ebcb0-f31e-4857-9a67-842057dd7df4","Type":"ContainerStarted","Data":"b11a85ce9f87e606d428e24722b9c6e6d2b9ccbc74d9c554afe1d110583ec71b"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.832033 4752 generic.go:334] "Generic (PLEG): container finished" podID="582c125f-cc05-442d-9bc0-0e588b1dc998" containerID="2481b42922c0f293032714520294188f058cd147597a79f1de9f878293e973d8" exitCode=0 Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.832122 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" event={"ID":"582c125f-cc05-442d-9bc0-0e588b1dc998","Type":"ContainerDied","Data":"2481b42922c0f293032714520294188f058cd147597a79f1de9f878293e973d8"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.844790 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" event={"ID":"f0bcb3ab-f03a-410d-911e-baffe86632c2","Type":"ContainerStarted","Data":"64685c518b29aa87092cf07a7ba5ed5bb0b228f6e91c3926f31a8b2fa394061d"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.848703 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jsq6c" event={"ID":"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e","Type":"ContainerStarted","Data":"c5ca6ed455198ef5909780c89a5b0aaf4ef35773ab72a5f3b628fa8bcfe089e5"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.848747 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jsq6c" event={"ID":"f3b9cef1-7930-44bf-9bc7-5e28f8282e4e","Type":"ContainerStarted","Data":"b7226546c5ee8592e7112f2d5cfffac93ab1911d1842b69b34a5938024bce34c"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.849195 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.849697 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.349680804 +0000 UTC m=+247.256497655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.872237 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" event={"ID":"80cf94de-a056-4243-9ade-775eea192f3f","Type":"ContainerStarted","Data":"8df857a560e41da0efc68992bc09b6c340632ea2765eb32116c6bd39849dcabf"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.880071 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" event={"ID":"ae0e3048-c296-4078-a2da-4f630f3e01bc","Type":"ContainerStarted","Data":"48ac455ff14c063a12f5194859f766b71d9b3fc0c04e2ec2781cc1a5bcfd62b6"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.942490 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.943087 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" event={"ID":"02863d54-8b48-4358-8dfe-b43269b1da31","Type":"ContainerStarted","Data":"45cb998762850c7c93f48207a452b65e58961e1752d31a6eef93626437a10b6f"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.943155 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lhljv" event={"ID":"89afffa7-80af-4d36-9f60-c79ad00c737f","Type":"ContainerStarted","Data":"0064b3384a4832d6650dfa635ca2b0a382a792a13223420db2dc4bc51541916f"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.953952 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:06 crc kubenswrapper[4752]: E0227 17:39:06.955170 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.455154238 +0000 UTC m=+247.361971079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.982033 4752 patch_prober.go:28] interesting pod/console-operator-58897d9998-lhljv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.982107 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lhljv" podUID="89afffa7-80af-4d36-9f60-c79ad00c737f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.998515 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" event={"ID":"a4eceb80-1269-438b-ad35-1a125e8b98c9","Type":"ContainerStarted","Data":"93b7ffcb3307b5708f7b9a200f275ed8a6720c8f280c65a117fffaa447b000a1"} Feb 27 17:39:06 crc kubenswrapper[4752]: I0227 17:39:06.998573 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" event={"ID":"a4eceb80-1269-438b-ad35-1a125e8b98c9","Type":"ContainerStarted","Data":"f58dc7a06e5a7d3edddaa90465cd22e4024abbd331401f4132095d3573fa298d"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.025058 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" event={"ID":"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b","Type":"ContainerStarted","Data":"ea61c46b1177ef7cedb466875f13f81d943ace62d56710820e64c26e67497df3"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.037730 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"] Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.045836 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" event={"ID":"8dc5b308-08ce-4729-b854-d91947b6fce5","Type":"ContainerStarted","Data":"051e2a96bd7fb4ec76e61773105751471f2c0663d789f11d464fbb575864493d"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.059916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" event={"ID":"7d737ed6-90b8-4607-bf34-a21992a704e6","Type":"ContainerStarted","Data":"44aa57aaff8d2d8848424b72e67248a1b20d4242a372e7255ce375b9b7e089de"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.076860 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.079054 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.579041728 +0000 UTC m=+247.485858579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.083595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cfp2v" event={"ID":"3083b21b-220e-4439-a3c1-18c79f073151","Type":"ContainerStarted","Data":"17544283e1c30243004ca1df48d5e414a342a812c5f997d7d22e007c61d93e18"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.083668 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cfp2v" event={"ID":"3083b21b-220e-4439-a3c1-18c79f073151","Type":"ContainerStarted","Data":"2df3ca4edc468e1ad6a056dcbb38f248d4b883df6f0ebb1473373b833cb8e69f"} Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.088902 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zj6td" podStartSLOduration=191.088890251 podStartE2EDuration="3m11.088890251s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.088699996 +0000 UTC m=+246.995516857" watchObservedRunningTime="2026-02-27 17:39:07.088890251 +0000 UTC m=+246.995707102" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.159431 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xl44c"] Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.163657 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-g96w7" podStartSLOduration=191.163638816 podStartE2EDuration="3m11.163638816s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.160607132 +0000 UTC m=+247.067423983" watchObservedRunningTime="2026-02-27 17:39:07.163638816 +0000 UTC m=+247.070455667" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.178364 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35262: no serving certificate available for the kubelet" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.182828 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.184627 4752 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.684597354 +0000 UTC m=+247.591414205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.230307 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mnfsk" podStartSLOduration=191.230280662 podStartE2EDuration="3m11.230280662s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.229131564 +0000 UTC m=+247.135948415" watchObservedRunningTime="2026-02-27 17:39:07.230280662 +0000 UTC m=+247.137097513" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.278342 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35276: no serving certificate available for the kubelet" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.302383 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.302773 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.802756411 +0000 UTC m=+247.709573262 (durationBeforeRetry 500ms). 
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.384691 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35284: no serving certificate available for the kubelet"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.404927 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.405592 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:07.905571629 +0000 UTC m=+247.812388480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.414912 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jsq6c"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.419597 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 27 17:39:07 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld
Feb 27 17:39:07 crc kubenswrapper[4752]: [+]process-running ok
Feb 27 17:39:07 crc kubenswrapper[4752]: healthz check failed
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.419655 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.445252 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jsq6c" podStartSLOduration=191.445216988 podStartE2EDuration="3m11.445216988s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.402734639 +0000 UTC m=+247.309551490" watchObservedRunningTime="2026-02-27 17:39:07.445216988 +0000 UTC m=+247.352033839"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.484163 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536898-598km"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.494362 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4r7b2" podStartSLOduration=191.494340101 podStartE2EDuration="3m11.494340101s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.479511385 +0000 UTC m=+247.386328236" watchObservedRunningTime="2026-02-27 17:39:07.494340101 +0000 UTC m=+247.401156952"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.498553 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35298: no serving certificate available for the kubelet"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.508590 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.509176 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.009159417 +0000 UTC m=+247.915976268 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.513434 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.587135 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35312: no serving certificate available for the kubelet"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.595425 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q42ms" podStartSLOduration=191.595408277 podStartE2EDuration="3m11.595408277s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.550880127 +0000 UTC m=+247.457696978" watchObservedRunningTime="2026-02-27 17:39:07.595408277 +0000 UTC m=+247.502225128"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.614591 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.615109 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.115087983 +0000 UTC m=+248.021904834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.692795 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35314: no serving certificate available for the kubelet"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.716943 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lhljv" podStartSLOduration=191.716923747 podStartE2EDuration="3m11.716923747s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.716582599 +0000 UTC m=+247.623399470" watchObservedRunningTime="2026-02-27 17:39:07.716923747 +0000 UTC m=+247.623740598"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.718646 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.719170 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.219132082 +0000 UTC m=+248.125948943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.824546 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.825113 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.325095038 +0000 UTC m=+248.231911889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.823129 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35316: no serving certificate available for the kubelet"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.831501 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5vcfl"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.846391 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.851271 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rjvxc"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.862124 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.868703 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" podStartSLOduration=190.866753107 podStartE2EDuration="3m10.866753107s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.824184456 +0000 UTC m=+247.731001307" watchObservedRunningTime="2026-02-27 17:39:07.866753107 +0000 UTC m=+247.773569958"
Feb 27 17:39:07 crc kubenswrapper[4752]: W0227 17:39:07.871575 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod455fc602_83ab_4544_bfa2_01e0b35bc8dc.slice/crio-2d8b40f98095f3b05c18361f347f8a2874d0a9e3af9a646b13227fea4f044f6b WatchSource:0}: Error finding container 2d8b40f98095f3b05c18361f347f8a2874d0a9e3af9a646b13227fea4f044f6b: Status 404 returned error can't find the container with id 2d8b40f98095f3b05c18361f347f8a2874d0a9e3af9a646b13227fea4f044f6b
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.888651 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" podStartSLOduration=190.888630937 podStartE2EDuration="3m10.888630937s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.871996136 +0000 UTC m=+247.778812987" watchObservedRunningTime="2026-02-27 17:39:07.888630937 +0000 UTC m=+247.795447788"
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.889127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dm4md"]
Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.921385 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" podStartSLOduration=191.921357415 podStartE2EDuration="3m11.921357415s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.918751121 +0000 UTC m=+247.825568152" watchObservedRunningTime="2026-02-27 17:39:07.921357415 +0000 UTC m=+247.828174266"
pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" podStartSLOduration=191.921357415 podStartE2EDuration="3m11.921357415s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:07.918751121 +0000 UTC m=+247.825568152" watchObservedRunningTime="2026-02-27 17:39:07.921357415 +0000 UTC m=+247.828174266" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.924297 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35326: no serving certificate available for the kubelet" Feb 27 17:39:07 crc kubenswrapper[4752]: I0227 17:39:07.926638 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:07 crc kubenswrapper[4752]: E0227 17:39:07.927109 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.427093057 +0000 UTC m=+248.333909908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.027669 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.028049 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.528032579 +0000 UTC m=+248.434849420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.054428 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lfrdd" podStartSLOduration=192.05440926 podStartE2EDuration="3m12.05440926s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.014433723 +0000 UTC m=+247.921250564" watchObservedRunningTime="2026-02-27 17:39:08.05440926 +0000 UTC m=+247.961226111" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.056755 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd"] Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.081757 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94"] Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.127613 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx"] Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.129372 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.129805 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.629789292 +0000 UTC m=+248.536606143 (durationBeforeRetry 500ms). 
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.136309 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" event={"ID":"506337c9-3cdd-451f-a015-6d3e25d43c22","Type":"ContainerStarted","Data":"b4218bfd84c9718b7d47287b37869583351ff8175636efbd815f4f7211dd9e40"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.138883 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" event={"ID":"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8","Type":"ContainerStarted","Data":"e2cf183cb9da9ad5016c5b366b781b246f08652af3dbdac9cc4c7c9ef0e2d326"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.146246 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dm4md" event={"ID":"b3cad9bd-0350-4262-86a0-cdb0f3a776ad","Type":"ContainerStarted","Data":"6023d0a53055677b7406093b0ae9f5d29001fc3c5eab6b969a84b3f92fe9c581"}
Feb 27 17:39:08 crc kubenswrapper[4752]: W0227 17:39:08.195042 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86e38ee_36f1_4bf4_86c0_3b13b1d95103.slice/crio-9c74dd548459f4b6af5a3e532052e262bc0ce1f4675d9cf91272c3c6d4b4905f WatchSource:0}: Error finding container 9c74dd548459f4b6af5a3e532052e262bc0ce1f4675d9cf91272c3c6d4b4905f: Status 404 returned error can't find the container with id 9c74dd548459f4b6af5a3e532052e262bc0ce1f4675d9cf91272c3c6d4b4905f
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.207489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" event={"ID":"bb16b639-2f9c-414f-8cae-41f805a10165","Type":"ContainerStarted","Data":"119ea4121e14069301ef9da8681aafac5477338f17d5d477e4dada537b14306d"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.207560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" event={"ID":"bb16b639-2f9c-414f-8cae-41f805a10165","Type":"ContainerStarted","Data":"30481aba07e69fc3e26033b104d168a34c5b72a1e529fcab507ccc491b19f3f5"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.209164 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.210724 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fcc4l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.210810 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.215382 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" event={"ID":"e61b5d7e-db3b-49d0-94de-95a19c8fc89b","Type":"ContainerStarted","Data":"0611eaf91f7ad95c6ece6941d2817ec03c2563add5a777ed9d7d0685fa85afba"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.238285 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" event={"ID":"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa","Type":"ContainerStarted","Data":"7da3a57a9a19e0c008c974e4bff24e1525ba2602635526251e9a61f964b749b6"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.238335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" event={"ID":"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa","Type":"ContainerStarted","Data":"8e864b5b509bc45abec319b93542b10fd8a643696a5ca1978818c6a597ecde52"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.240952 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.241920 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.74189244 +0000 UTC m=+248.648709291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.252478 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cfp2v" podStartSLOduration=192.25245359 podStartE2EDuration="3m12.25245359s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.205964253 +0000 UTC m=+248.112781104" watchObservedRunningTime="2026-02-27 17:39:08.25245359 +0000 UTC m=+248.159270431"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.265421 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" podStartSLOduration=191.26539723 podStartE2EDuration="3m11.26539723s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.252578874 +0000 UTC m=+248.159395745" watchObservedRunningTime="2026-02-27 17:39:08.26539723 +0000 UTC m=+248.172214091"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.271945 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" event={"ID":"020f54a0-e34f-46bf-8ec0-56da0ea1a8f8","Type":"ContainerStarted","Data":"17dbdb45436cb48147e03cea78892ed0f1c261a3790bf52c8f29df152d823b2c"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.278477 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.299456 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn"]
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.309413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" event={"ID":"02863d54-8b48-4358-8dfe-b43269b1da31","Type":"ContainerStarted","Data":"7cbe56d71fe50b73632fd26e4043862fc69a76c95cc6f3c35d7d72e1309f2c57"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.311018 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.338873 4752 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8r7pq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.338920 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.342625 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb" podStartSLOduration=191.342591036 podStartE2EDuration="3m11.342591036s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.34073266 +0000 UTC m=+248.247549511" watchObservedRunningTime="2026-02-27 17:39:08.342591036 +0000 UTC m=+248.249407887"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.346770 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.347286 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.847269972 +0000 UTC m=+248.754086823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.347961 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m24gb"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.384503 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjvxc" event={"ID":"e372223f-91ea-40f7-93f8-38bb0a08c646","Type":"ContainerStarted","Data":"7e1a6e8940b170e4d769f9cdeec17a793a04d331eba4620177b21157fa4b3e41"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.413695 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvlkw" event={"ID":"80cf94de-a056-4243-9ade-775eea192f3f","Type":"ContainerStarted","Data":"899d578cf318f37b74a521b7a60fda38d8f5f9c479ed18d7741196893dcdf625"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.419752 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" event={"ID":"1dcb36df-1b47-47d4-933c-24498112a4a6","Type":"ContainerStarted","Data":"f901752c73e458d6369375c0efe49bea79c65595c6ea8362e4a99248c2d872e5"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.421323 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" podStartSLOduration=192.42130212 podStartE2EDuration="3m12.42130212s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.384705786 +0000 UTC m=+248.291522827" watchObservedRunningTime="2026-02-27 17:39:08.42130212 +0000 UTC m=+248.328118971"
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.384705786 +0000 UTC m=+248.291522827" watchObservedRunningTime="2026-02-27 17:39:08.42130212 +0000 UTC m=+248.328118971" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.422896 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:08 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:08 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:08 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.422937 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.442994 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" event={"ID":"bf9d110c-49da-44fc-b366-308548c190ad","Type":"ContainerStarted","Data":"426ec0e7e3aab7f83eae6ce4ed02082aeeb337633e75260bc2cced083ee98c17"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.443043 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" event={"ID":"bf9d110c-49da-44fc-b366-308548c190ad","Type":"ContainerStarted","Data":"067a75a63081ed6cd2a9414993b257cbcd36115f3bd8448aa8eaec6e12346161"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.447900 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.448222 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.948194844 +0000 UTC m=+248.855011685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.448641 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.450795 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:08.950775727 +0000 UTC m=+248.857592578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.480389 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" event={"ID":"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e","Type":"ContainerStarted","Data":"4289720c26b01a67e267879415fcdf35f55286afa4148d075df08d136b710422"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.480437 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" event={"ID":"5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e","Type":"ContainerStarted","Data":"fe8618bb51b005bdeeeafc5f01e11476f08db2dc6e86fca7c6fbaaf34085925e"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.501457 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" event={"ID":"8dc5b308-08ce-4729-b854-d91947b6fce5","Type":"ContainerStarted","Data":"eb5fc00a27f00a1e855cdf8bdb6fa1bde1b023bb1aea0b78b4443fbc64d3b848"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.540232 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pwkv" podStartSLOduration=191.540218066 podStartE2EDuration="3m11.540218066s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.537675713 +0000 UTC m=+248.444492564" watchObservedRunningTime="2026-02-27 17:39:08.540218066 +0000 UTC m=+248.447034927" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.553855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.554625 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.054606991 +0000 UTC m=+248.961423842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.560798 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" event={"ID":"b07dbba0-06fa-4b50-9ce7-76f943a9a355","Type":"ContainerStarted","Data":"8868766f301023cc57211118206eb468582df22a286ce8b8488f2a3c84146254"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.560838 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" event={"ID":"b07dbba0-06fa-4b50-9ce7-76f943a9a355","Type":"ContainerStarted","Data":"289e64bd5e00edbceb8dd173b98c2547e91bde8a17bbf43fb8774470f801a0a6"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.591658 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" event={"ID":"081ebfd1-71a9-470a-8f73-9a673f6bcb9b","Type":"ContainerStarted","Data":"71df25f9560e8ffd24a18b3ecea79d704b8f176efc1bec054d70e3ca1572301b"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.609625 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qxgv7" podStartSLOduration=192.609587959 podStartE2EDuration="3m12.609587959s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.601580281 +0000 UTC m=+248.508397132" watchObservedRunningTime="2026-02-27 17:39:08.609587959 +0000 UTC m=+248.516404810" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.615760 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" event={"ID":"582c125f-cc05-442d-9bc0-0e588b1dc998","Type":"ContainerStarted","Data":"856d6ae3d0af4b28057c42e52be28b8158d65ecb5aeb22d808213340b1ee2368"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.616721 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.658578 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" 
podStartSLOduration=192.658560308 podStartE2EDuration="3m12.658560308s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.657127793 +0000 UTC m=+248.563944644" watchObservedRunningTime="2026-02-27 17:39:08.658560308 +0000 UTC m=+248.565377159" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.659277 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.678586 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.178569052 +0000 UTC m=+249.085385903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.700981 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" event={"ID":"f0bcb3ab-f03a-410d-911e-baffe86632c2","Type":"ContainerStarted","Data":"18883205cdcf273e9247ee84ce1be1e9974b5a7b029e7a4e4ee828ba67d5cf42"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.713626 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35332: no serving certificate available for the kubelet" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.741450 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" event={"ID":"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b","Type":"ContainerStarted","Data":"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122"} Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.742887 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.759875 4752 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tbqcp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.759955 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 27 17:39:08 crc 
Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.760765 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.260745471 +0000 UTC m=+249.167562322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.762565 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lcgcn" podStartSLOduration=192.762554846 podStartE2EDuration="3m12.762554846s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.761577702 +0000 UTC m=+248.668394553" watchObservedRunningTime="2026-02-27 17:39:08.762554846 +0000 UTC m=+248.669371697"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.764575 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kbhp4" event={"ID":"8a2327c4-0233-4973-927c-5b434e75ece4","Type":"ContainerStarted","Data":"3bf4e139287b32174d7ae15801792502522618a8430ef8dc0fe3779caee99a02"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.774286 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" event={"ID":"7d737ed6-90b8-4607-bf34-a21992a704e6","Type":"ContainerStarted","Data":"738fdf3193687313219601dd8b5762f39f33aa37543f1852918e784afb05b3a5"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.778635 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" event={"ID":"455fc602-83ab-4544-bfa2-01e0b35bc8dc","Type":"ContainerStarted","Data":"2d8b40f98095f3b05c18361f347f8a2874d0a9e3af9a646b13227fea4f044f6b"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.793425 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" podStartSLOduration=192.793409828 podStartE2EDuration="3m12.793409828s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.792769292 +0000 UTC m=+248.699586133" watchObservedRunningTime="2026-02-27 17:39:08.793409828 +0000 UTC m=+248.700226679"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.805652 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lhljv" event={"ID":"89afffa7-80af-4d36-9f60-c79ad00c737f","Type":"ContainerStarted","Data":"65161efb092e465ce1c2302afebaacb2f56b5c316c1348cba8a64b2a61f5ea2c"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.838091 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kbhp4" podStartSLOduration=5.838073971 podStartE2EDuration="5.838073971s" podCreationTimestamp="2026-02-27 17:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.83725576 +0000 UTC m=+248.744072611" watchObservedRunningTime="2026-02-27 17:39:08.838073971 +0000 UTC m=+248.744890822"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.861409 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" event={"ID":"72a3daf3-ca59-4211-9195-1b5c70e4de7c","Type":"ContainerStarted","Data":"43b735de1dfe742d8711de25694f01f3c401c3ff64cb6ed30c78908ab8b7cc02"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.862890 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.872500 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.37247805 +0000 UTC m=+249.279294901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.905454 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" event={"ID":"ae0e3048-c296-4078-a2da-4f630f3e01bc","Type":"ContainerStarted","Data":"b6f2026241063ba8027f8a7b6c5ddd86d533b17a7123562b28299f36b404b4c7"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.906630 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.928403 4752 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9s4n4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.928482 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" podUID="ae0e3048-c296-4078-a2da-4f630f3e01bc" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.940088 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfp2v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.940175 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfp2v" podUID="3083b21b-220e-4439-a3c1-18c79f073151" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.946749 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-cfp2v"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.946784 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536898-598km" event={"ID":"cc36acda-9447-479d-b741-c063ecb91f3e","Type":"ContainerStarted","Data":"d9e5e636fa97f4b425a771395cec6ca9c8622bead9a512d3f403b668a9044791"}
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.962043 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" podStartSLOduration=191.962025921 podStartE2EDuration="3m11.962025921s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.960717109 +0000 UTC m=+248.867533960" watchObservedRunningTime="2026-02-27 17:39:08.962025921 +0000 UTC m=+248.868842762"
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.963405 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rxzmm" podStartSLOduration=192.963399095 podStartE2EDuration="3m12.963399095s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:08.87977 +0000 UTC m=+248.786586851" watchObservedRunningTime="2026-02-27 17:39:08.963399095 +0000 UTC m=+248.870215946"
Feb 27 17:39:08 crc kubenswrapper[4752]: E0227 17:39:08.966102 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.466088911 +0000 UTC m=+249.372905762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.983883 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:08 crc kubenswrapper[4752]: I0227 17:39:08.984250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.019904 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.51988344 +0000 UTC m=+249.426700291 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.093285 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.096412 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.596375118 +0000 UTC m=+249.503191969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.134061 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"]
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.139964 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" podStartSLOduration=192.139942104 podStartE2EDuration="3m12.139942104s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:09.064618104 +0000 UTC m=+248.971434955" watchObservedRunningTime="2026-02-27 17:39:09.139942104 +0000 UTC m=+249.046758965"
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.158396 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.158772 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7"
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.191816 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"]
Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.197847 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:09 crc
kubenswrapper[4752]: E0227 17:39:09.198264 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.698251054 +0000 UTC m=+249.605067905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.199756 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tjktn" podStartSLOduration=192.199736861 podStartE2EDuration="3m12.199736861s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:09.182758131 +0000 UTC m=+249.089575002" watchObservedRunningTime="2026-02-27 17:39:09.199736861 +0000 UTC m=+249.106553712" Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.204613 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.305115 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.305573 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:09.805557004 +0000 UTC m=+249.712373855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.410486 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.410935 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 17:39:09.910901575 +0000 UTC m=+249.817718426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.423853 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:09 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:09 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:09 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.423916 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.511962 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.512483 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.012456242 +0000 UTC m=+249.919273093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.614256 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.614563 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.114551453 +0000 UTC m=+250.021368304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.716640 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.716835 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.216803778 +0000 UTC m=+250.123620629 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.717570 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.717981 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.217970957 +0000 UTC m=+250.124787888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.807223 4752 patch_prober.go:28] interesting pod/console-operator-58897d9998-lhljv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.807289 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lhljv" podUID="89afffa7-80af-4d36-9f60-c79ad00c737f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.818605 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.818771 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.318745735 +0000 UTC m=+250.225562586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.818874 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.819200 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.319191026 +0000 UTC m=+250.226007877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.919889 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.920072 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.420041637 +0000 UTC m=+250.326858488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.920691 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:09 crc kubenswrapper[4752]: E0227 17:39:09.921072 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.421063802 +0000 UTC m=+250.327880653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.949260 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dm4md" event={"ID":"b3cad9bd-0350-4262-86a0-cdb0f3a776ad","Type":"ContainerStarted","Data":"cdf7ef8e21bab6fe97bf60768feb345cb2937f7e7b5d5969b1c66d393cf142ba"} Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.963040 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" event={"ID":"9e8dcbe6-373e-4c76-93fa-d30b75dd50db","Type":"ContainerStarted","Data":"4163f70dfde506b5c079fca1d2aa25000e8c7fe2ea394875051829a613a28f58"} Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.963106 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" event={"ID":"9e8dcbe6-373e-4c76-93fa-d30b75dd50db","Type":"ContainerStarted","Data":"69dbcfb6a199483b1fbd1b4e87084bb03b196c2f51e1969cce7c67f16bb521fa"} Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.963119 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" event={"ID":"9e8dcbe6-373e-4c76-93fa-d30b75dd50db","Type":"ContainerStarted","Data":"e5b0751d9144273daf70a505fbd71dc9ebc334918dec836e44006a816177d6f0"} Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.963214 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.968823 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kbhp4" event={"ID":"8a2327c4-0233-4973-927c-5b434e75ece4","Type":"ContainerStarted","Data":"f78f2963094c820d4ec83f92ac115d6eb014f05e7a6a0c3bb6e5f3ae31925518"} Feb 27 17:39:09 crc kubenswrapper[4752]: I0227 17:39:09.983870 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" event={"ID":"127ebcb0-f31e-4857-9a67-842057dd7df4","Type":"ContainerStarted","Data":"27073ee9018c8bfbccde814f94e7c17dd9fc85d09e62ffe9895a5ccb9d6ca784"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.002278 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" event={"ID":"1dcb36df-1b47-47d4-933c-24498112a4a6","Type":"ContainerStarted","Data":"1f6627fcebeed4f8c02274d98b5ff5d40d925961a110edc72bb91f5cb382ffe2"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.023319 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.024837 4752 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.524804063 +0000 UTC m=+250.431620914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.026583 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" podStartSLOduration=193.026564127 podStartE2EDuration="3m13.026564127s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.022716412 +0000 UTC m=+249.929533263" watchObservedRunningTime="2026-02-27 17:39:10.026564127 +0000 UTC m=+249.933380978" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.029109 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dm4md" podStartSLOduration=8.029103809 podStartE2EDuration="8.029103809s" podCreationTimestamp="2026-02-27 17:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:09.975483925 +0000 UTC m=+249.882300776" watchObservedRunningTime="2026-02-27 17:39:10.029103809 +0000 UTC m=+249.935920660" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.040565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" event={"ID":"506337c9-3cdd-451f-a015-6d3e25d43c22","Type":"ContainerStarted","Data":"0a4500209dd0143665aebdb2c2200c1f8dc9f413bf085dfff8613379b992bc8a"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.040617 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" event={"ID":"506337c9-3cdd-451f-a015-6d3e25d43c22","Type":"ContainerStarted","Data":"e696d86d4dab23b45e9b820bf2174d3328b4269d2ceea73d903b5de6b3a98da7"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.063391 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" event={"ID":"d1444448-daeb-4e89-8b0c-2c97127b00c2","Type":"ContainerStarted","Data":"b5f3625c370ad924970f982462c01f940a6d28df1464f3d5b0deef761fc4b8a8"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.063445 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" event={"ID":"d1444448-daeb-4e89-8b0c-2c97127b00c2","Type":"ContainerStarted","Data":"ecd79d860234ef35f5c1bf7f1f9b30beab77311f55858e2edc0ec7ce9627ce04"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.064492 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:10 crc 
kubenswrapper[4752]: I0227 17:39:10.068305 4752 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4vgtn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.068375 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" podUID="d1444448-daeb-4e89-8b0c-2c97127b00c2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.088412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5vcfl" event={"ID":"455fc602-83ab-4544-bfa2-01e0b35bc8dc","Type":"ContainerStarted","Data":"c089b5adb4a76ed9cc98c51bf4e34714e6c022f01f6ad98ec757f219b01ff41d"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.094558 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-krm5q" podStartSLOduration=194.094543065 podStartE2EDuration="3m14.094543065s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.060210158 +0000 UTC m=+249.967027009" watchObservedRunningTime="2026-02-27 17:39:10.094543065 +0000 UTC m=+250.001359916" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.095418 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" podStartSLOduration=194.095413777 podStartE2EDuration="3m14.095413777s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.09353023 +0000 UTC m=+250.000347091" watchObservedRunningTime="2026-02-27 17:39:10.095413777 +0000 UTC m=+250.002230628" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.124115 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35340: no serving certificate available for the kubelet" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.125703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.126671 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.626652298 +0000 UTC m=+250.533469149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.126951 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" event={"ID":"856bbe2c-3e10-42fb-ac3c-470e0057bbaf","Type":"ContainerStarted","Data":"adc7797b47903413daaaacf6f991b9cbaa1905d45caa04004244afe787ce2fb1"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.126990 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" event={"ID":"856bbe2c-3e10-42fb-ac3c-470e0057bbaf","Type":"ContainerStarted","Data":"142301138e0cecc86eff6a1a0ac7dd02dd78f53b85ef972a874c2b909595e3ab"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.127016 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" event={"ID":"856bbe2c-3e10-42fb-ac3c-470e0057bbaf","Type":"ContainerStarted","Data":"21f289936e414e467ac474c06abac71bee2c9d6fc32a06223acab6f855a7ce23"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.133378 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hbfsp" podStartSLOduration=194.133351014 podStartE2EDuration="3m14.133351014s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.129182921 +0000 UTC m=+250.035999772" watchObservedRunningTime="2026-02-27 17:39:10.133351014 +0000 UTC m=+250.040167875" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.142515 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.143776 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.147788 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.156380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" event={"ID":"bf9d110c-49da-44fc-b366-308548c190ad","Type":"ContainerStarted","Data":"14bf56dbe01024a8119625ff2ef7eae361ebca0adfcb56d414862b9bb05c4036"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.180856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" event={"ID":"0480be6e-a859-4bd7-8aad-0a7e5bf06a0e","Type":"ContainerStarted","Data":"8e7cd5dd6b1e1574d340d7525c0d03be477768e3fe79f8809d0fd0c072f1038a"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.185484 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" podStartSLOduration=193.18545266 podStartE2EDuration="3m13.18545266s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.181382729 +0000 UTC m=+250.088199580" watchObservedRunningTime="2026-02-27 17:39:10.18545266 +0000 UTC m=+250.092269501" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.187684 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.190653 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" event={"ID":"a3047a3c-0344-4f73-a4d1-5f2c278fa1b8","Type":"ContainerStarted","Data":"6824a1630c6502509acf595620c0ad4130d393837e04d8fe369ae4f44fee7a3f"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.205490 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" event={"ID":"b86e38ee-36f1-4bf4-86c0-3b13b1d95103","Type":"ContainerStarted","Data":"379df66ca89f6ddd4fa84744b8fc50617fe776ab9d41e50d23f8d233fe86ef01"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.205565 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" event={"ID":"b86e38ee-36f1-4bf4-86c0-3b13b1d95103","Type":"ContainerStarted","Data":"22422852b451a98129d2794797da509c5bb49cb2c5234c39f4c2ae6ac5f3a987"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.205576 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" event={"ID":"b86e38ee-36f1-4bf4-86c0-3b13b1d95103","Type":"ContainerStarted","Data":"9c74dd548459f4b6af5a3e532052e262bc0ce1f4675d9cf91272c3c6d4b4905f"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.223509 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rjvxc" event={"ID":"e372223f-91ea-40f7-93f8-38bb0a08c646","Type":"ContainerStarted","Data":"606db1dff82ff51558e6e91720fa4e0d0421161a3cadf165ab9bffa94105774a"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.223567 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-rjvxc" event={"ID":"e372223f-91ea-40f7-93f8-38bb0a08c646","Type":"ContainerStarted","Data":"f82c1a4f42de3306c91b475ac59af69c3dcd2e937a9c00a497d86d7a4092888a"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.224184 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.234696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.234942 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.734917011 +0000 UTC m=+250.641733872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.235047 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.235462 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.735446374 +0000 UTC m=+250.642263225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.267161 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" event={"ID":"e61b5d7e-db3b-49d0-94de-95a19c8fc89b","Type":"ContainerStarted","Data":"b18f6af3a4f902bf49ad68243f8af9e5fd9b5ab645776e117153f49ee8402708"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.270621 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" event={"ID":"8dc5b308-08ce-4729-b854-d91947b6fce5","Type":"ContainerStarted","Data":"fd5b6310c0bd9381063f5261865707988b20c3430d2adb2c09d65c6f91d06b4e"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.272279 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" event={"ID":"7d6b9fa5-61ff-425f-b41a-5dbf2f3f7efa","Type":"ContainerStarted","Data":"b9c45e0855886b4f380f9320cbf6b26853744105728c40b41f335ae590722526"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.303677 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" podStartSLOduration=194.303652449 podStartE2EDuration="3m14.303652449s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.242103849 +0000 UTC m=+250.148920710" watchObservedRunningTime="2026-02-27 17:39:10.303652449 +0000 UTC m=+250.210469300" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.308754 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" podUID="621fbcf2-089a-44b7-8130-e4f188d4b03f" containerName="route-controller-manager" containerID="cri-o://fe8745c17a10cae50196d9cd0f9a1b4a3447c8425188f03ffe0b51de6333fe83" gracePeriod=30 Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.310275 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" event={"ID":"081ebfd1-71a9-470a-8f73-9a673f6bcb9b","Type":"ContainerStarted","Data":"6bd15e535336295be46eb22c10ede9ff6ae024a77bc0347fe1d6e29b019716c3"} Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.311035 4752 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fcc4l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.311064 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" 
Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.311122 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfp2v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.311135 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfp2v" podUID="3083b21b-220e-4439-a3c1-18c79f073151" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.331052 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ccqh7" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.334186 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9s4n4" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.336778 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.336934 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl86p\" (UniqueName: \"kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.336966 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.337286 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.338216 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.838192211 +0000 UTC m=+250.745009232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.352138 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz9rd" podStartSLOduration=194.352110335 podStartE2EDuration="3m14.352110335s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.306793156 +0000 UTC m=+250.213610007" watchObservedRunningTime="2026-02-27 17:39:10.352110335 +0000 UTC m=+250.258927186" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.353435 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4psjf" podStartSLOduration=194.353425678 podStartE2EDuration="3m14.353425678s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.340920969 +0000 UTC m=+250.247737840" watchObservedRunningTime="2026-02-27 17:39:10.353425678 +0000 UTC m=+250.260242529" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.370788 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.394710 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dm9bt"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.412946 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.430819 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dm9bt"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.433461 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.455942 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456010 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl86p\" (UniqueName: \"kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456611 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456654 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.456727 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btb9f\" (UniqueName: \"kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.459612 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:10.959596039 +0000 UTC m=+250.866412880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.460264 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.460343 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:10 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:10 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:10 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.460423 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.469097 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.495259 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-x5tb9" podStartSLOduration=194.495236179 podStartE2EDuration="3m14.495236179s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.454480423 +0000 UTC m=+250.361297274" watchObservedRunningTime="2026-02-27 17:39:10.495236179 +0000 UTC m=+250.402053030" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.516933 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lhljv" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.525539 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.533071 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl86p\" (UniqueName: \"kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p\") pod \"certified-operators-t8w4t\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") 
" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.557535 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.561687 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.562040 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.562069 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.562105 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btb9f\" (UniqueName: \"kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.562693 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.562800 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.062782077 +0000 UTC m=+250.969598928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.563009 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.575115 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.590834 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.630496 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btb9f\" (UniqueName: \"kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f\") pod \"community-operators-dm9bt\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.664276 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.664631 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.164616521 +0000 UTC m=+251.071433372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.767640 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.767794 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlzt\" (UniqueName: \"kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.767871 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.767918 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " 
pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.768018 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.268001634 +0000 UTC m=+251.174818485 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.782040 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rjvxc" podStartSLOduration=7.782026091 podStartE2EDuration="7.782026091s" podCreationTimestamp="2026-02-27 17:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:10.717417055 +0000 UTC m=+250.624233906" watchObservedRunningTime="2026-02-27 17:39:10.782026091 +0000 UTC m=+250.688842942" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.783044 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.783991 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.796007 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.810396 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.819648 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.871859 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.871905 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.871938 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.871990 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.872010 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvgh\" (UniqueName: \"kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.872039 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.872054 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlzt\" (UniqueName: \"kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.873024 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.873467 4752 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.373450917 +0000 UTC m=+251.280267768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.874877 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.978721 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.978942 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvgh\" (UniqueName: \"kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.979046 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.979073 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.979514 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:10 crc kubenswrapper[4752]: E0227 17:39:10.979581 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.479566627 +0000 UTC m=+251.386383478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:10 crc kubenswrapper[4752]: I0227 17:39:10.980037 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.028520 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlzt\" (UniqueName: \"kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt\") pod \"certified-operators-694vw\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.088046 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.088554 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.588539808 +0000 UTC m=+251.495356659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.098210 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvgh\" (UniqueName: \"kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh\") pod \"community-operators-b7x2z\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.124706 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.137009 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqd62" podStartSLOduration=195.136989344 podStartE2EDuration="3m15.136989344s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:11.000405492 +0000 UTC m=+250.907222353" watchObservedRunningTime="2026-02-27 17:39:11.136989344 +0000 UTC m=+251.043806195" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.195499 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.195852 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.695834837 +0000 UTC m=+251.602651688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.211527 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.228762 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8s94" podStartSLOduration=195.22874541 podStartE2EDuration="3m15.22874541s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:11.140193233 +0000 UTC m=+251.047010084" watchObservedRunningTime="2026-02-27 17:39:11.22874541 +0000 UTC m=+251.135562261" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.230482 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lps47" podStartSLOduration=195.230478053 podStartE2EDuration="3m15.230478053s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:11.228204877 +0000 UTC m=+251.135021728" watchObservedRunningTime="2026-02-27 17:39:11.230478053 +0000 UTC m=+251.137294904" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.293180 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d5rp7" podStartSLOduration=194.293163491 podStartE2EDuration="3m14.293163491s" podCreationTimestamp="2026-02-27 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:11.292355791 +0000 UTC m=+251.199172642" watchObservedRunningTime="2026-02-27 17:39:11.293163491 +0000 UTC m=+251.199980332" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.294363 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fb9f6" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.297394 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.297734 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.797723263 +0000 UTC m=+251.704540114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.401616 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.402046 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:11.902023949 +0000 UTC m=+251.808840800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.439100 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:11 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:11 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:11 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.439178 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.445357 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.470264 4752 generic.go:334] "Generic (PLEG): container finished" podID="621fbcf2-089a-44b7-8130-e4f188d4b03f" containerID="fe8745c17a10cae50196d9cd0f9a1b4a3447c8425188f03ffe0b51de6333fe83" exitCode=0 Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.471598 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" event={"ID":"621fbcf2-089a-44b7-8130-e4f188d4b03f","Type":"ContainerDied","Data":"fe8745c17a10cae50196d9cd0f9a1b4a3447c8425188f03ffe0b51de6333fe83"} Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.471869 4752 scope.go:117] "RemoveContainer" containerID="fe8745c17a10cae50196d9cd0f9a1b4a3447c8425188f03ffe0b51de6333fe83" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.481022 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerName="controller-manager" containerID="cri-o://c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122" gracePeriod=30 Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.488790 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.532644 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.533364 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.033351641 +0000 UTC m=+251.940168492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.616351 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.616647 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621fbcf2-089a-44b7-8130-e4f188d4b03f" containerName="route-controller-manager" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.616660 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="621fbcf2-089a-44b7-8130-e4f188d4b03f" containerName="route-controller-manager" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.616760 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="621fbcf2-089a-44b7-8130-e4f188d4b03f" containerName="route-controller-manager" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.617115 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.617212 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.629099 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.629392 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.640878 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca\") pod \"621fbcf2-089a-44b7-8130-e4f188d4b03f\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.640940 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdhhf\" (UniqueName: \"kubernetes.io/projected/621fbcf2-089a-44b7-8130-e4f188d4b03f-kube-api-access-kdhhf\") pod \"621fbcf2-089a-44b7-8130-e4f188d4b03f\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.641030 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert\") pod \"621fbcf2-089a-44b7-8130-e4f188d4b03f\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.641256 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.641372 4752 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config\") pod \"621fbcf2-089a-44b7-8130-e4f188d4b03f\" (UID: \"621fbcf2-089a-44b7-8130-e4f188d4b03f\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.643670 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca" (OuterVolumeSpecName: "client-ca") pod "621fbcf2-089a-44b7-8130-e4f188d4b03f" (UID: "621fbcf2-089a-44b7-8130-e4f188d4b03f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.666976 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.16695044 +0000 UTC m=+252.073767291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.667425 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config" (OuterVolumeSpecName: "config") pod "621fbcf2-089a-44b7-8130-e4f188d4b03f" (UID: "621fbcf2-089a-44b7-8130-e4f188d4b03f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.680371 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "621fbcf2-089a-44b7-8130-e4f188d4b03f" (UID: "621fbcf2-089a-44b7-8130-e4f188d4b03f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.722360 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621fbcf2-089a-44b7-8130-e4f188d4b03f-kube-api-access-kdhhf" (OuterVolumeSpecName: "kube-api-access-kdhhf") pod "621fbcf2-089a-44b7-8130-e4f188d4b03f" (UID: "621fbcf2-089a-44b7-8130-e4f188d4b03f"). InnerVolumeSpecName "kube-api-access-kdhhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.747814 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.747882 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.747959 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.748014 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.748024 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/621fbcf2-089a-44b7-8130-e4f188d4b03f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.748033 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdhhf\" (UniqueName: \"kubernetes.io/projected/621fbcf2-089a-44b7-8130-e4f188d4b03f-kube-api-access-kdhhf\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.748042 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/621fbcf2-089a-44b7-8130-e4f188d4b03f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.748362 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.24834782 +0000 UTC m=+252.155164671 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.784971 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dm9bt"] Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.854643 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.854986 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.855042 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.855136 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.855294 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.35527812 +0000 UTC m=+252.262094971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.879353 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:11 crc kubenswrapper[4752]: I0227 17:39:11.957883 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:11 crc kubenswrapper[4752]: E0227 17:39:11.958219 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.458205772 +0000 UTC m=+252.365022613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.034415 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.063103 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.063447 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.563410399 +0000 UTC m=+252.470227250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.063648 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.064206 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.564197889 +0000 UTC m=+252.471014740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.166302 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.166694 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.666641278 +0000 UTC m=+252.573458129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.166931 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.167603 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.667581372 +0000 UTC m=+252.574398223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.198039 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"] Feb 27 17:39:12 crc kubenswrapper[4752]: W0227 17:39:12.259453 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdbb722_11b5_43c4_b8dc_8758bbc7164c.slice/crio-218c6b1bcb4eac96d3646237e251a6dd2118ccaa7c8244de01170a41b8bbfe8b WatchSource:0}: Error finding container 218c6b1bcb4eac96d3646237e251a6dd2118ccaa7c8244de01170a41b8bbfe8b: Status 404 returned error can't find the container with id 218c6b1bcb4eac96d3646237e251a6dd2118ccaa7c8244de01170a41b8bbfe8b Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.263534 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.264811 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.267802 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.268125 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 17:39:12.768086633 +0000 UTC m=+252.674903484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.268433 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.268870 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.768849642 +0000 UTC m=+252.675666493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: W0227 17:39:12.269338 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db85625_5324_4606_a2f1_740416e8d218.slice/crio-4e5af055ad8d648a0e0c4f942e2f556bd16aa6a76022acf5bc40b130a1a9c514 WatchSource:0}: Error finding container 4e5af055ad8d648a0e0c4f942e2f556bd16aa6a76022acf5bc40b130a1a9c514: Status 404 returned error can't find the container with id 4e5af055ad8d648a0e0c4f942e2f556bd16aa6a76022acf5bc40b130a1a9c514 Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.269461 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.337984 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"] Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.338385 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerName="controller-manager" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.338416 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerName="controller-manager" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.338573 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerName="controller-manager" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.339456 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.342109 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.342362 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.369127 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.370710 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca\") pod \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.370981 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.371024 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config\") pod \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.371054 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles\") pod \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.371394 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert\") pod \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.371489 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjgqk\" (UniqueName: \"kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk\") pod \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\" (UID: \"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.373819 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config" (OuterVolumeSpecName: "config") pod "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" (UID: "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.373928 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 17:39:12.873910576 +0000 UTC m=+252.780727427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.374234 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca" (OuterVolumeSpecName: "client-ca") pod "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" (UID: "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.374754 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" (UID: "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.379888 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vgtn" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.381301 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk" (OuterVolumeSpecName: "kube-api-access-kjgqk") pod "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" (UID: "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b"). InnerVolumeSpecName "kube-api-access-kjgqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.381553 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" (UID: "bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.419137 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:12 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:12 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:12 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.419203 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472591 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472668 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472709 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472743 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dcf\" (UniqueName: \"kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472812 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472835 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472845 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472857 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.472866 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjgqk\" (UniqueName: \"kubernetes.io/projected/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b-kube-api-access-kjgqk\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.473108 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:12.973095445 +0000 UTC m=+252.879912296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.486267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f3fef0c-3415-427e-9ebc-407d820a732c","Type":"ContainerStarted","Data":"de9db09b6fc56ba8473b1a19748331ee452f9b2a5a820373953446db0790b673"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.489125 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerStarted","Data":"4e5af055ad8d648a0e0c4f942e2f556bd16aa6a76022acf5bc40b130a1a9c514"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.489891 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerStarted","Data":"a989bc54c41973efce7b203829bacf30e6f75abcef373ba1b2a1d4b779d48764"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.492194 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" event={"ID":"621fbcf2-089a-44b7-8130-e4f188d4b03f","Type":"ContainerDied","Data":"30876044b986f7c558a97b36834e4a9f1187f434ec8f3e1c3bf35165dce044bd"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.492295 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.508745 4752 generic.go:334] "Generic (PLEG): container finished" podID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" containerID="c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122" exitCode=0 Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.508783 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.508829 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" event={"ID":"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b","Type":"ContainerDied","Data":"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.508856 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tbqcp" event={"ID":"bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b","Type":"ContainerDied","Data":"ea61c46b1177ef7cedb466875f13f81d943ace62d56710820e64c26e67497df3"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.508872 4752 scope.go:117] "RemoveContainer" containerID="c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.518360 4752 generic.go:334] "Generic (PLEG): container finished" podID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerID="da05ba83c7087e24142734f5125d0aec25569e538b4475bb5905f4b6eeaa7cc9" exitCode=0 Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.518422 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerDied","Data":"da05ba83c7087e24142734f5125d0aec25569e538b4475bb5905f4b6eeaa7cc9"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.518447 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerStarted","Data":"93833e8fad58d7a12c47f3930647a04cfea06b810ffd8de7518d4734be27553e"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.527030 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerStarted","Data":"218c6b1bcb4eac96d3646237e251a6dd2118ccaa7c8244de01170a41b8bbfe8b"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.536575 4752 generic.go:334] "Generic (PLEG): container finished" podID="1dcb36df-1b47-47d4-933c-24498112a4a6" containerID="1f6627fcebeed4f8c02274d98b5ff5d40d925961a110edc72bb91f5cb382ffe2" exitCode=0 Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.537052 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" event={"ID":"1dcb36df-1b47-47d4-933c-24498112a4a6","Type":"ContainerDied","Data":"1f6627fcebeed4f8c02274d98b5ff5d40d925961a110edc72bb91f5cb382ffe2"} Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.561872 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.562524 4752 scope.go:117] "RemoveContainer" containerID="c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.565030 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4vxw"] Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.569025 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122\": container with ID starting with c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122 not found: ID does not exist" containerID="c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.569109 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122"} err="failed to get container status \"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122\": rpc error: code = NotFound desc = could not find container \"c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122\": container with ID starting with c3eef9e4529774290ac869a39f10b03c14f5740b64d86e64bed1cb549ec5d122 not found: ID does not exist" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.575051 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.575227 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.075200867 +0000 UTC m=+252.982017718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.583655 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.583808 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92dcf\" (UniqueName: \"kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.584831 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.588159 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-27 17:39:13.088133836 +0000 UTC m=+252.994950677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.591240 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.591449 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.591890 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.596402 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.599864 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tbqcp"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.608466 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dcf\" (UniqueName: \"kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf\") pod \"redhat-marketplace-6kwhk\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.682323 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.695462 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.698054 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-27 17:39:13.19803389 +0000 UTC m=+253.104850741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.698370 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.699578 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.199560127 +0000 UTC m=+253.106377068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.723562 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.724834 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.725859 4752 ???:1] "http: TLS handshake error from 192.168.126.11:35350: no serving certificate available for the kubelet" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.743293 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"] Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.801662 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.802207 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.302187421 +0000 UTC m=+253.209004272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.889664 4752 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.909664 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ds4\" (UniqueName: \"kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.909712 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.909769 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.909852 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:12 crc kubenswrapper[4752]: E0227 17:39:12.910016 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.410004613 +0000 UTC m=+253.316821464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.920622 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621fbcf2-089a-44b7-8130-e4f188d4b03f" path="/var/lib/kubelet/pods/621fbcf2-089a-44b7-8130-e4f188d4b03f/volumes" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.921356 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b" path="/var/lib/kubelet/pods/bc4c8c40-7b60-44c4-a1f4-e7c2ed84035b/volumes" Feb 27 17:39:12 crc kubenswrapper[4752]: I0227 17:39:12.937641 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"] Feb 27 17:39:12 crc kubenswrapper[4752]: W0227 17:39:12.950438 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78323811_0abf_4cc6_921c_5d0e56e895a3.slice/crio-80a6ca63d9072e0788ee8e2f438f632f56651e2a5afba730fab068d44a8eaa5e WatchSource:0}: Error finding container 80a6ca63d9072e0788ee8e2f438f632f56651e2a5afba730fab068d44a8eaa5e: Status 404 returned error can't find the container with id 80a6ca63d9072e0788ee8e2f438f632f56651e2a5afba730fab068d44a8eaa5e Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.010742 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.010840 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.510810573 +0000 UTC m=+253.417627414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.011166 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ds4\" (UniqueName: \"kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.011223 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.011258 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.011293 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.011882 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.012658 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.012725 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.512709769 +0000 UTC m=+253.419526620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.033135 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ds4\" (UniqueName: \"kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4\") pod \"redhat-marketplace-zhhxr\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.111831 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.112005 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.611986961 +0000 UTC m=+253.518803812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.112373 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.112693 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.612685898 +0000 UTC m=+253.519502749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.148292 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.213346 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.213486 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.713459176 +0000 UTC m=+253.620276027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.213694 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:13 crc kubenswrapper[4752]: E0227 17:39:13.214129 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-27 17:39:13.714111073 +0000 UTC m=+253.620927924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r47g5" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.215563 4752 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-27T17:39:12.889957579Z","Handler":null,"Name":""} Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.228326 4752 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.228359 4752 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.315207 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.316349 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.317641 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.322473 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.322789 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.323176 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.323439 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.323678 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.324397 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.328224 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.329769 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"] Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.334577 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.334852 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.354733 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"] Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.356230 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.417632 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.422256 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:13 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:13 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:13 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.422429 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.434216 4752 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.434299 4752 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519249 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l262m\" (UniqueName: \"kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519324 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519388 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519417 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xczj\" (UniqueName: \"kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519502 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.519582 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.526021 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"]
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.527073 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.555322 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"]
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.565649 4752 generic.go:334] "Generic (PLEG): container finished" podID="899d1101-b4de-4326-b442-6450903b2a30" containerID="1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2" exitCode=0
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.565702 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerDied","Data":"1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.568702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r47g5\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.573479 4752 generic.go:334] "Generic (PLEG): container finished" podID="78323811-0abf-4cc6-921c-5d0e56e895a3" containerID="81b581e3c047bb180deb6110ca6cb6d537277e7ffaf240d1aecd6e4c6679653f" exitCode=0
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.573537 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kwhk" event={"ID":"78323811-0abf-4cc6-921c-5d0e56e895a3","Type":"ContainerDied","Data":"81b581e3c047bb180deb6110ca6cb6d537277e7ffaf240d1aecd6e4c6679653f"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.573564 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kwhk" event={"ID":"78323811-0abf-4cc6-921c-5d0e56e895a3","Type":"ContainerStarted","Data":"80a6ca63d9072e0788ee8e2f438f632f56651e2a5afba730fab068d44a8eaa5e"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.576294 4752 generic.go:334] "Generic (PLEG): container finished" podID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerID="ccff345bec7c8faae051b8382ed00bf24f3490e117e60693c1104c00b7908e3f" exitCode=0
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.576844 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerDied","Data":"ccff345bec7c8faae051b8382ed00bf24f3490e117e60693c1104c00b7908e3f"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.585404 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" event={"ID":"081ebfd1-71a9-470a-8f73-9a673f6bcb9b","Type":"ContainerStarted","Data":"7261a4986b26653df92aa3896d73efef40f0f6543ac40a4f23d5bf5a09e16d9e"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.585435 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" event={"ID":"081ebfd1-71a9-470a-8f73-9a673f6bcb9b","Type":"ContainerStarted","Data":"b345cd0242cb1586eda9bc2ddbf0a40ce73b7f2547cb86ab2c789cbbbd938953"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.585462 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" event={"ID":"081ebfd1-71a9-470a-8f73-9a673f6bcb9b","Type":"ContainerStarted","Data":"5e5e8f7f114208148e95845a2cfbabff66936a114960692f0c391e02cea9b922"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.590378 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f3fef0c-3415-427e-9ebc-407d820a732c","Type":"ContainerStarted","Data":"e227ae0154f485a1f7d9fe91f6dc131ea06c2fed713062a973163e9f92a7d28a"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.606993 4752 generic.go:334] "Generic (PLEG): container finished" podID="2db85625-5324-4606-a2f1-740416e8d218" containerID="43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36" exitCode=0
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.607748 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerDied","Data":"43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36"}
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621529 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621584 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l262m\" (UniqueName: \"kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621606 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621624 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621649 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621683 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621706 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xczj\" (UniqueName: \"kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621744 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkqq\" (UniqueName: \"kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.621787 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.622567 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.622904 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.626033 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.628183 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.628658 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"]
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.645053 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.645243 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l262m\" (UniqueName: \"kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m\") pod \"redhat-operators-qvj4t\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.660743 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xczj\" (UniqueName: \"kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj\") pod \"route-controller-manager-84fb987488-h86k2\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.673096 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qvj4t"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.710529 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xl44c" podStartSLOduration=11.7105087 podStartE2EDuration="11.7105087s" podCreationTimestamp="2026-02-27 17:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:13.707612118 +0000 UTC m=+253.614428979" watchObservedRunningTime="2026-02-27 17:39:13.7105087 +0000 UTC m=+253.617325551"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.723171 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.723215 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.723260 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkqq\" (UniqueName: \"kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.724195 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.724414 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.746294 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkqq\" (UniqueName: \"kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq\") pod \"redhat-operators-cllc6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.819310 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.861939 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cllc6"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.871505 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.933752 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume\") pod \"1dcb36df-1b47-47d4-933c-24498112a4a6\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") "
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.933859 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d54g\" (UniqueName: \"kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g\") pod \"1dcb36df-1b47-47d4-933c-24498112a4a6\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") "
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.933915 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume\") pod \"1dcb36df-1b47-47d4-933c-24498112a4a6\" (UID: \"1dcb36df-1b47-47d4-933c-24498112a4a6\") "
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.935428 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1dcb36df-1b47-47d4-933c-24498112a4a6" (UID: "1dcb36df-1b47-47d4-933c-24498112a4a6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.936012 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcb36df-1b47-47d4-933c-24498112a4a6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.939538 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.940992 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"]
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.943158 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1dcb36df-1b47-47d4-933c-24498112a4a6" (UID: "1dcb36df-1b47-47d4-933c-24498112a4a6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:39:13 crc kubenswrapper[4752]: I0227 17:39:13.944594 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g" (OuterVolumeSpecName: "kube-api-access-4d54g") pod "1dcb36df-1b47-47d4-933c-24498112a4a6" (UID: "1dcb36df-1b47-47d4-933c-24498112a4a6"). InnerVolumeSpecName "kube-api-access-4d54g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:39:13 crc kubenswrapper[4752]: W0227 17:39:13.977374 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod137184c7_4f82_4685_89fa_d5152358e216.slice/crio-b71ab18bb56ca20e3c29d69e08973645a3cec4a658e437a266506e2cf59bf5b0 WatchSource:0}: Error finding container b71ab18bb56ca20e3c29d69e08973645a3cec4a658e437a266506e2cf59bf5b0: Status 404 returned error can't find the container with id b71ab18bb56ca20e3c29d69e08973645a3cec4a658e437a266506e2cf59bf5b0
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.042122 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.042250 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.042309 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.042668 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dcb36df-1b47-47d4-933c-24498112a4a6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.042692 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d54g\" (UniqueName: \"kubernetes.io/projected/1dcb36df-1b47-47d4-933c-24498112a4a6-kube-api-access-4d54g\") on node \"crc\" DevicePath \"\""
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.044324 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.048613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.050082 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.092554 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"]
Feb 27 17:39:14 crc kubenswrapper[4752]: W0227 17:39:14.117030 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57573690_e945_43f5_b3ed_e3451f5a8a47.slice/crio-637a6aaff9e442b73c8adbb7bedb8cae14905e19dd33712c20caabc6b8f0f12f WatchSource:0}: Error finding container 637a6aaff9e442b73c8adbb7bedb8cae14905e19dd33712c20caabc6b8f0f12f: Status 404 returned error can't find the container with id 637a6aaff9e442b73c8adbb7bedb8cae14905e19dd33712c20caabc6b8f0f12f
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.150160 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.150236 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.161005 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.161488 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/937bbb35-a3c2-435c-86c5-1072f3a54595-metrics-certs\") pod \"network-metrics-daemon-jkjwj\" (UID: \"937bbb35-a3c2-435c-86c5-1072f3a54595\") " pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.219357 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.219431 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.230904 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.234526 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.242742 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.246167 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.261073 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.263372 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rl86p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t8w4t_openshift-marketplace(ebdbb722-11b5-43c4-b8dc-8758bbc7164c): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.264007 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.264662 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.267914 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.268074 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrvgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b7x2z_openshift-marketplace(899d1101-b4de-4326-b442-6450903b2a30): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.269339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.273400 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jkjwj" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.277532 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.278478 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjlzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-694vw_openshift-marketplace(2db85625-5324-4606-a2f1-740416e8d218): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.279699 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 
(Internal Server Error)\"" pod="openshift-marketplace/certified-operators-694vw" podUID="2db85625-5324-4606-a2f1-740416e8d218" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.312596 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.312845 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcb36df-1b47-47d4-933c-24498112a4a6" containerName="collect-profiles" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.312861 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcb36df-1b47-47d4-933c-24498112a4a6" containerName="collect-profiles" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.312961 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcb36df-1b47-47d4-933c-24498112a4a6" containerName="collect-profiles" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.313371 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.316216 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.316264 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.318545 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.318663 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.319356 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.319362 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.329351 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.332765 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.369853 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.369911 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.369948 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.369970 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.370057 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwk4\" (UniqueName: \"kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.411075 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"] Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.431414 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:14 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:14 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:14 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.431470 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:14 crc kubenswrapper[4752]: W0227 17:39:14.450781 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049892e0_329b_442b_b232_997ee454f9c6.slice/crio-f7869c4384e6a4b6cc9d05c573dccc56b8d062db6eab32dbf51fc5c3a3d70019 WatchSource:0}: Error finding container f7869c4384e6a4b6cc9d05c573dccc56b8d062db6eab32dbf51fc5c3a3d70019: Status 404 returned error can't find the container with id f7869c4384e6a4b6cc9d05c573dccc56b8d062db6eab32dbf51fc5c3a3d70019 Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.472283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.472360 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " 
pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.472388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.477349 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwk4\" (UniqueName: \"kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.477443 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.478383 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.479028 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.484200 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.486445 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.537948 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwk4\" (UniqueName: \"kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4\") pod \"controller-manager-5d6b8dcf69-f7mfs\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.638607 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="760298d8-7405-4c9e-b322-b08dbc182da8" containerID="fa128acf65ea08e40b355cfdea643b4a3d524d680578e4d277243d36ce63dc01" exitCode=0 Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.638922 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerDied","Data":"fa128acf65ea08e40b355cfdea643b4a3d524d680578e4d277243d36ce63dc01"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.638946 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerStarted","Data":"d4deb7355bc7c0c217ad810f134a92b0cfaa53583b26a0ed7bcb975e092a45a8"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.645518 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" event={"ID":"3297aa73-488b-4cac-bb32-defff1256ff3","Type":"ContainerStarted","Data":"91bfbca401ce5f8258b044a03f1cacfa05c367b82c3166f6ac005d94207f35d9"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.645558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" event={"ID":"3297aa73-488b-4cac-bb32-defff1256ff3","Type":"ContainerStarted","Data":"52c64f025ed5bbfd25b6c7747b48b46b768234895d9c37fa67cbe15931ecdfe9"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.646365 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.669290 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerStarted","Data":"f7869c4384e6a4b6cc9d05c573dccc56b8d062db6eab32dbf51fc5c3a3d70019"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.683646 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.689006 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.689907 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" podStartSLOduration=3.689887411 podStartE2EDuration="3.689887411s" podCreationTimestamp="2026-02-27 17:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:14.68659698 +0000 UTC m=+254.593413831" watchObservedRunningTime="2026-02-27 17:39:14.689887411 +0000 UTC m=+254.596704262" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.690753 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536890-gjpc2" event={"ID":"1dcb36df-1b47-47d4-933c-24498112a4a6","Type":"ContainerDied","Data":"f901752c73e458d6369375c0efe49bea79c65595c6ea8362e4a99248c2d872e5"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.690781 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f901752c73e458d6369375c0efe49bea79c65595c6ea8362e4a99248c2d872e5" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.707054 4752 generic.go:334] "Generic (PLEG): container finished" podID="8f3fef0c-3415-427e-9ebc-407d820a732c" containerID="e227ae0154f485a1f7d9fe91f6dc131ea06c2fed713062a973163e9f92a7d28a" exitCode=0 Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.707174 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f3fef0c-3415-427e-9ebc-407d820a732c","Type":"ContainerDied","Data":"e227ae0154f485a1f7d9fe91f6dc131ea06c2fed713062a973163e9f92a7d28a"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.713332 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" event={"ID":"57573690-e945-43f5-b3ed-e3451f5a8a47","Type":"ContainerStarted","Data":"2a00d9700aac7ca41290f95c8c78dedc3add9ca04dc4b893d762f6a1d1026e12"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.713365 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" event={"ID":"57573690-e945-43f5-b3ed-e3451f5a8a47","Type":"ContainerStarted","Data":"637a6aaff9e442b73c8adbb7bedb8cae14905e19dd33712c20caabc6b8f0f12f"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.713585 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.715589 4752 generic.go:334] "Generic (PLEG): container finished" podID="137184c7-4f82-4685-89fa-d5152358e216" containerID="2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b" exitCode=0 Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.718493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerDied","Data":"2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.718529 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" 
event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerStarted","Data":"b71ab18bb56ca20e3c29d69e08973645a3cec4a658e437a266506e2cf59bf5b0"} Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.723718 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hrbkp" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.746280 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" podStartSLOduration=198.746259183 podStartE2EDuration="3m18.746259183s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:14.743378612 +0000 UTC m=+254.650195463" watchObservedRunningTime="2026-02-27 17:39:14.746259183 +0000 UTC m=+254.653076034" Feb 27 17:39:14 crc kubenswrapper[4752]: W0227 17:39:14.756704 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4cdfc84a78b25ce1f2628aa89f1aaa87cc57ebb69f3e424cd28e6484d96de8f7 WatchSource:0}: Error finding container 4cdfc84a78b25ce1f2628aa89f1aaa87cc57ebb69f3e424cd28e6484d96de8f7: Status 404 returned error can't find the container with id 4cdfc84a78b25ce1f2628aa89f1aaa87cc57ebb69f3e424cd28e6484d96de8f7 Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.766470 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.774777 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-694vw" podUID="2db85625-5324-4606-a2f1-740416e8d218" Feb 27 17:39:14 crc kubenswrapper[4752]: E0227 17:39:14.774898 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.776113 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.776195 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.805092 4752 patch_prober.go:28] interesting pod/console-f9d7485db-zj6td container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.805247 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zj6td" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" 
containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.955942 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 27 17:39:14 crc kubenswrapper[4752]: I0227 17:39:14.989561 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jkjwj"] Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.088131 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" Feb 27 17:39:15 crc kubenswrapper[4752]: W0227 17:39:15.091047 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3880b96511896347f34a8caba6ee09ef04f35ae9db8eb31526ab98c4af9d40bd WatchSource:0}: Error finding container 3880b96511896347f34a8caba6ee09ef04f35ae9db8eb31526ab98c4af9d40bd: Status 404 returned error can't find the container with id 3880b96511896347f34a8caba6ee09ef04f35ae9db8eb31526ab98c4af9d40bd Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.106135 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfp2v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.106185 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfp2v" podUID="3083b21b-220e-4439-a3c1-18c79f073151" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.106155 4752 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfp2v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.106445 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cfp2v" podUID="3083b21b-220e-4439-a3c1-18c79f073151" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.257334 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.299033 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access\") pod \"8f3fef0c-3415-427e-9ebc-407d820a732c\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.299414 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir\") pod \"8f3fef0c-3415-427e-9ebc-407d820a732c\" (UID: \"8f3fef0c-3415-427e-9ebc-407d820a732c\") " Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.299815 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f3fef0c-3415-427e-9ebc-407d820a732c" (UID: "8f3fef0c-3415-427e-9ebc-407d820a732c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.305189 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f3fef0c-3415-427e-9ebc-407d820a732c" (UID: "8f3fef0c-3415-427e-9ebc-407d820a732c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.402001 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f3fef0c-3415-427e-9ebc-407d820a732c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.402045 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f3fef0c-3415-427e-9ebc-407d820a732c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.415403 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.418822 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:15 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:15 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:15 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.418857 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.495992 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.730095 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.730241 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f3fef0c-3415-427e-9ebc-407d820a732c","Type":"ContainerDied","Data":"de9db09b6fc56ba8473b1a19748331ee452f9b2a5a820373953446db0790b673"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.730833 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9db09b6fc56ba8473b1a19748331ee452f9b2a5a820373953446db0790b673" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.732724 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" event={"ID":"bf462527-46a1-4092-b3de-bef4e89e73a6","Type":"ContainerStarted","Data":"4e9df117af9fe9519deb3bbec86abd4af0521b55710df1bce23fa6f6d48a452c"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.732756 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" event={"ID":"bf462527-46a1-4092-b3de-bef4e89e73a6","Type":"ContainerStarted","Data":"dfe618dd6d6f84d5f56e8e6c3c222ab650a88e32332c20723f359cac4831ccf8"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.733249 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.739664 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"969c4cbb712d0e41d0f1b79a822ad158594d001ba2c8bad314ed38c731abf030"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.739729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"20877352d1f39bb9e01af19e97e5367c501ec47cce1327d3868f74e27f170af1"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.741177 4752 patch_prober.go:28] interesting pod/controller-manager-5d6b8dcf69-f7mfs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.741270 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.743068 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"11d8f1456391f8477c7c86d20fac163dcb24a33a1e586b42aac370b1d9f678e2"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.743093 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4cdfc84a78b25ce1f2628aa89f1aaa87cc57ebb69f3e424cd28e6484d96de8f7"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.743588 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.750454 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" event={"ID":"937bbb35-a3c2-435c-86c5-1072f3a54595","Type":"ContainerStarted","Data":"1b9dee0d2dbc0b3396d0d331c359cab8a342e038e6104abe7c2fa17f861109b3"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.750484 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" event={"ID":"937bbb35-a3c2-435c-86c5-1072f3a54595","Type":"ContainerStarted","Data":"fa6ea644293b35f01cc1603631a549138cc2b8fef128de02b37a6ca01cf19858"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.757458 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"29ca14a171db984e0c708dbd2fbf5e509382dae8dddc88f4e7aea0a6ab48f071"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.757507 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3880b96511896347f34a8caba6ee09ef04f35ae9db8eb31526ab98c4af9d40bd"} Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.758124 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" podStartSLOduration=6.758112168 podStartE2EDuration="6.758112168s" podCreationTimestamp="2026-02-27 17:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:15.74968508 +0000 UTC m=+255.656501931" watchObservedRunningTime="2026-02-27 17:39:15.758112168 +0000 UTC m=+255.664929029" Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.792394 4752 generic.go:334] "Generic (PLEG): container finished" podID="049892e0-329b-442b-b232-997ee454f9c6" containerID="0e9fd2e747fc409eb6f57eaaeda905137a4f650ca891f9ce18922e79bc2e2b23" exitCode=0 Feb 27 17:39:15 crc kubenswrapper[4752]: I0227 17:39:15.792470 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerDied","Data":"0e9fd2e747fc409eb6f57eaaeda905137a4f650ca891f9ce18922e79bc2e2b23"} Feb 27 17:39:16 crc kubenswrapper[4752]: I0227 17:39:16.416312 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:16 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:16 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:16 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:16 crc kubenswrapper[4752]: I0227 17:39:16.416679 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" 
podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:16 crc kubenswrapper[4752]: I0227 17:39:16.812851 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jkjwj" event={"ID":"937bbb35-a3c2-435c-86c5-1072f3a54595","Type":"ContainerStarted","Data":"157a9338ba997140c93ff46d3611338c9a8099c56f0b61a8f3c7d52d82f2cd99"} Feb 27 17:39:16 crc kubenswrapper[4752]: I0227 17:39:16.822343 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:16 crc kubenswrapper[4752]: I0227 17:39:16.833530 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jkjwj" podStartSLOduration=200.833516162 podStartE2EDuration="3m20.833516162s" podCreationTimestamp="2026-02-27 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:16.831449521 +0000 UTC m=+256.738266382" watchObservedRunningTime="2026-02-27 17:39:16.833516162 +0000 UTC m=+256.740333013" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.415731 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:17 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:17 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:17 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.415776 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.509748 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 17:39:17 crc kubenswrapper[4752]: E0227 17:39:17.510222 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3fef0c-3415-427e-9ebc-407d820a732c" containerName="pruner" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.510243 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3fef0c-3415-427e-9ebc-407d820a732c" containerName="pruner" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.510398 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3fef0c-3415-427e-9ebc-407d820a732c" containerName="pruner" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.510934 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.515377 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.515594 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.517284 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.547400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.547435 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.648561 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.648615 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.648992 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.679566 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.857395 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:17 crc kubenswrapper[4752]: I0227 17:39:17.897392 4752 ???:1] "http: TLS handshake error from 192.168.126.11:33744: no serving certificate available for the kubelet" Feb 27 17:39:18 crc kubenswrapper[4752]: I0227 17:39:18.042834 4752 ???:1] "http: TLS handshake error from 192.168.126.11:33752: no serving certificate available for the kubelet" Feb 27 17:39:18 crc kubenswrapper[4752]: I0227 17:39:18.358121 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 27 17:39:18 crc kubenswrapper[4752]: W0227 17:39:18.374393 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2e5a7857_5608_4073_b2f0_72b3e4461c95.slice/crio-c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433 WatchSource:0}: Error finding container c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433: Status 404 returned error can't find the container with id c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433 Feb 27 17:39:18 crc kubenswrapper[4752]: I0227 17:39:18.417580 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:18 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:18 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:18 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:18 crc kubenswrapper[4752]: I0227 17:39:18.417678 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:18 crc kubenswrapper[4752]: I0227 17:39:18.845403 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2e5a7857-5608-4073-b2f0-72b3e4461c95","Type":"ContainerStarted","Data":"c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433"} Feb 27 17:39:19 crc kubenswrapper[4752]: I0227 17:39:19.417923 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:19 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:19 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:19 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:19 crc kubenswrapper[4752]: I0227 17:39:19.418236 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:19 crc kubenswrapper[4752]: I0227 17:39:19.861416 4752 generic.go:334] "Generic (PLEG): container finished" podID="2e5a7857-5608-4073-b2f0-72b3e4461c95" containerID="0e4ef85d42c264cb8a6a8e3235ecb4f61a44edbb5fc2e361a548cce22f06836b" exitCode=0 Feb 27 17:39:19 crc kubenswrapper[4752]: I0227 17:39:19.861489 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
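
The "TLS handshake error ... no serving certificate available for the kubelet" lines mean the kubelet's HTTPS server is listening but its serving certificate has not been issued yet; the auto-csr-approver job that should unblock this appears later at 17:39:35, though its own image pull fails. In Go this pattern falls out of tls.Config.GetCertificate returning an error until a certificate rotates in; a minimal sketch, not the kubelet's actual certificate manager:

package main

import (
	"crypto/tls"
	"errors"
	"log"
	"net/http"
	"sync/atomic"
)

// current holds the serving certificate once a CSR has been approved and issued.
var current atomic.Pointer[tls.Certificate]

func main() {
	cfg := &tls.Config{
		GetCertificate: func(*tls.ClientHelloInfo) (*tls.Certificate, error) {
			if c := current.Load(); c != nil {
				return c, nil
			}
			// Until then every handshake fails, as in the log above.
			return nil, errors.New("no serving certificate available for the kubelet")
		},
	}
	srv := &http.Server{Addr: ":10250", TLSConfig: cfg}
	// Empty file paths: the certificate comes from GetCertificate at handshake time.
	log.Fatal(srv.ListenAndServeTLS("", ""))
}
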
event={"ID":"2e5a7857-5608-4073-b2f0-72b3e4461c95","Type":"ContainerDied","Data":"0e4ef85d42c264cb8a6a8e3235ecb4f61a44edbb5fc2e361a548cce22f06836b"} Feb 27 17:39:20 crc kubenswrapper[4752]: I0227 17:39:20.416892 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:20 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:20 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:20 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:20 crc kubenswrapper[4752]: I0227 17:39:20.417233 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:21 crc kubenswrapper[4752]: I0227 17:39:21.206119 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rjvxc" Feb 27 17:39:21 crc kubenswrapper[4752]: I0227 17:39:21.415624 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:21 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:21 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:21 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:21 crc kubenswrapper[4752]: I0227 17:39:21.415678 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:22 crc kubenswrapper[4752]: I0227 17:39:22.416093 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:22 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:22 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:22 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:22 crc kubenswrapper[4752]: I0227 17:39:22.416564 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:23 crc kubenswrapper[4752]: I0227 17:39:23.415597 4752 patch_prober.go:28] interesting pod/router-default-5444994796-jsq6c container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 27 17:39:23 crc kubenswrapper[4752]: [-]has-synced failed: reason withheld Feb 27 17:39:23 crc kubenswrapper[4752]: [+]process-running ok Feb 27 17:39:23 crc kubenswrapper[4752]: healthz check failed Feb 27 17:39:23 crc kubenswrapper[4752]: I0227 17:39:23.415685 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jsq6c" podUID="f3b9cef1-7930-44bf-9bc7-5e28f8282e4e" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 27 17:39:24 crc kubenswrapper[4752]: I0227 17:39:24.417287 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:24 crc kubenswrapper[4752]: I0227 17:39:24.419730 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jsq6c" Feb 27 17:39:24 crc kubenswrapper[4752]: I0227 17:39:24.782394 4752 patch_prober.go:28] interesting pod/console-f9d7485db-zj6td container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 27 17:39:24 crc kubenswrapper[4752]: I0227 17:39:24.782693 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zj6td" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.132299 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cfp2v" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.243180 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.428781 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir\") pod \"2e5a7857-5608-4073-b2f0-72b3e4461c95\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.428847 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access\") pod \"2e5a7857-5608-4073-b2f0-72b3e4461c95\" (UID: \"2e5a7857-5608-4073-b2f0-72b3e4461c95\") " Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.429089 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2e5a7857-5608-4073-b2f0-72b3e4461c95" (UID: "2e5a7857-5608-4073-b2f0-72b3e4461c95"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.430441 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e5a7857-5608-4073-b2f0-72b3e4461c95-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.434986 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2e5a7857-5608-4073-b2f0-72b3e4461c95" (UID: "2e5a7857-5608-4073-b2f0-72b3e4461c95"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.532193 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e5a7857-5608-4073-b2f0-72b3e4461c95-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.921758 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2e5a7857-5608-4073-b2f0-72b3e4461c95","Type":"ContainerDied","Data":"c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433"} Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.921805 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7cd2e8c60f5192ce5b19b9c99926968eeaaacad8a21693b428f99d9673c6433" Feb 27 17:39:25 crc kubenswrapper[4752]: I0227 17:39:25.921809 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 27 17:39:28 crc kubenswrapper[4752]: I0227 17:39:28.160255 4752 ???:1] "http: TLS handshake error from 192.168.126.11:46990: no serving certificate available for the kubelet" Feb 27 17:39:28 crc kubenswrapper[4752]: I0227 17:39:28.463641 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:28 crc kubenswrapper[4752]: I0227 17:39:28.464124 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" containerID="cri-o://4e9df117af9fe9519deb3bbec86abd4af0521b55710df1bce23fa6f6d48a452c" gracePeriod=30 Feb 27 17:39:28 crc kubenswrapper[4752]: I0227 17:39:28.502845 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:28 crc kubenswrapper[4752]: I0227 17:39:28.503368 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" containerName="route-controller-manager" containerID="cri-o://91bfbca401ce5f8258b044a03f1cacfa05c367b82c3166f6ac005d94207f35d9" gracePeriod=30 Feb 27 17:39:29 crc kubenswrapper[4752]: I0227 17:39:29.954045 4752 generic.go:334] "Generic (PLEG): container finished" podID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerID="4e9df117af9fe9519deb3bbec86abd4af0521b55710df1bce23fa6f6d48a452c" exitCode=0 Feb 27 17:39:29 crc kubenswrapper[4752]: I0227 17:39:29.954207 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" event={"ID":"bf462527-46a1-4092-b3de-bef4e89e73a6","Type":"ContainerDied","Data":"4e9df117af9fe9519deb3bbec86abd4af0521b55710df1bce23fa6f6d48a452c"} Feb 27 17:39:29 crc kubenswrapper[4752]: I0227 17:39:29.960749 4752 generic.go:334] "Generic (PLEG): container finished" podID="3297aa73-488b-4cac-bb32-defff1256ff3" containerID="91bfbca401ce5f8258b044a03f1cacfa05c367b82c3166f6ac005d94207f35d9" exitCode=0 Feb 27 17:39:29 crc kubenswrapper[4752]: I0227 17:39:29.960805 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" 
event={"ID":"3297aa73-488b-4cac-bb32-defff1256ff3","Type":"ContainerDied","Data":"91bfbca401ce5f8258b044a03f1cacfa05c367b82c3166f6ac005d94207f35d9"} Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.727127 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.727348 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrvgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b7x2z_openshift-marketplace(899d1101-b4de-4326-b442-6450903b2a30): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.728615 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.772677 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: 
reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.773106 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rl86p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t8w4t_openshift-marketplace(ebdbb722-11b5-43c4-b8dc-8758bbc7164c): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.774422 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.907923 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.908079 4752 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjlzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-694vw_openshift-marketplace(2db85625-5324-4606-a2f1-740416e8d218): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:30 crc kubenswrapper[4752]: E0227 17:39:30.909568 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/certified-operator-index@sha256=625372062485d8ed1e4e84c388a7d036cb39c1b93d8c56dd3418fce0c028b62b/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/certified-operators-694vw" podUID="2db85625-5324-4606-a2f1-740416e8d218" Feb 27 17:39:33 crc kubenswrapper[4752]: I0227 17:39:33.826396 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:39:33 crc kubenswrapper[4752]: I0227 17:39:33.949437 4752 patch_prober.go:28] interesting pod/route-controller-manager-84fb987488-h86k2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Feb 27 17:39:33 crc kubenswrapper[4752]: I0227 17:39:33.949491 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": 
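
The long "Unhandled Error" dumps above are the kubelet printing the full init-container spec after a pull failed; the underlying error is a CRI gRPC call ("rpc error: code = Unknown") in which CRI-O could not fetch the image's sigstore signature because registry.redhat.io answered 500. The kubelet talks to the runtime's image service over a Unix-socket gRPC connection roughly like this sketch; the socket path is the usual CRI-O default, assumed rather than read from this node:

package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// CRI-O's default socket; an assumption, not read from this node's config.
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	images := runtimeapi.NewImageServiceClient(conn)
	_, err = images.PullImage(context.Background(), &runtimeapi.PullImageRequest{
		Image: &runtimeapi.ImageSpec{Image: "registry.redhat.io/redhat/community-operator-index:v4.18"},
	})
	// A registry-side failure surfaces here as a gRPC status error, e.g.
	// rpc error: code = Unknown desc = ... status 500 (Internal Server Error).
	if err != nil {
		log.Printf("pull failed: %v", err)
	}
}
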
dial tcp 10.217.0.52:8443: connect: connection refused" Feb 27 17:39:34 crc kubenswrapper[4752]: I0227 17:39:34.685631 4752 patch_prober.go:28] interesting pod/controller-manager-5d6b8dcf69-f7mfs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 27 17:39:34 crc kubenswrapper[4752]: I0227 17:39:34.686316 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 27 17:39:34 crc kubenswrapper[4752]: I0227 17:39:34.784580 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:34 crc kubenswrapper[4752]: I0227 17:39:34.791268 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zj6td" Feb 27 17:39:35 crc kubenswrapper[4752]: E0227 17:39:35.138637 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:39:35 crc kubenswrapper[4752]: E0227 17:39:35.138776 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:39:35 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:39:35 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9djm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536898-598km_openshift-infra(cc36acda-9447-479d-b741-c063ecb91f3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 27 17:39:35 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:39:35 crc kubenswrapper[4752]: E0227 17:39:35.140388 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:39:35 crc kubenswrapper[4752]: E0227 17:39:35.994830 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image 
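
The auto-csr-approver job above never starts (its ose-cli image pull is cancelled), which helps explain why the kubelet's serving-certificate CSR stays unapproved and the TLS handshake errors keep recurring. Its command, quoted in the container spec, lists CSRs whose .status is empty and pipes them to `oc adm certificate approve`. A rough client-go equivalent of that one-liner; the condition reason is an assumption:

package main

import (
	"context"
	"log"

	certv1 "k8s.io/api/certificates/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ctx := context.Background()

	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(ctx, metav1.ListOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for i := range csrs.Items {
		csr := &csrs.Items[i]
		if len(csr.Status.Conditions) != 0 {
			continue // already approved or denied; mirrors the `if not .status` template filter
		}
		csr.Status.Conditions = append(csr.Status.Conditions, certv1.CertificateSigningRequestCondition{
			Type:   certv1.CertificateApproved,
			Status: corev1.ConditionTrue,
			Reason: "AutoApproved", // assumption; `oc adm certificate approve` sets its own reason
		})
		if _, err := cs.CertificatesV1().CertificateSigningRequests().
			UpdateApproval(ctx, csr.Name, csr, metav1.UpdateOptions{}); err != nil {
			log.Printf("approve %s: %v", csr.Name, err)
		}
	}
}
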
\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:39:36 crc kubenswrapper[4752]: I0227 17:39:36.323344 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:39:36 crc kubenswrapper[4752]: I0227 17:39:36.323719 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.754507 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.762453 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790199 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:37 crc kubenswrapper[4752]: E0227 17:39:37.790436 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5a7857-5608-4073-b2f0-72b3e4461c95" containerName="pruner" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790450 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5a7857-5608-4073-b2f0-72b3e4461c95" containerName="pruner" Feb 27 17:39:37 crc kubenswrapper[4752]: E0227 17:39:37.790473 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" containerName="route-controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790483 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" containerName="route-controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: E0227 17:39:37.790499 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790508 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790640 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" containerName="controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790657 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5a7857-5608-4073-b2f0-72b3e4461c95" containerName="pruner" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.790670 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" containerName="route-controller-manager" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.791103 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.825312 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.939828 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca\") pod \"bf462527-46a1-4092-b3de-bef4e89e73a6\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.939912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca\") pod \"3297aa73-488b-4cac-bb32-defff1256ff3\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.939954 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwk4\" (UniqueName: \"kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4\") pod \"bf462527-46a1-4092-b3de-bef4e89e73a6\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940025 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert\") pod \"3297aa73-488b-4cac-bb32-defff1256ff3\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940090 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config\") pod \"3297aa73-488b-4cac-bb32-defff1256ff3\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940213 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert\") pod \"bf462527-46a1-4092-b3de-bef4e89e73a6\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940281 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config\") pod \"bf462527-46a1-4092-b3de-bef4e89e73a6\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940337 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xczj\" (UniqueName: \"kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj\") pod \"3297aa73-488b-4cac-bb32-defff1256ff3\" (UID: \"3297aa73-488b-4cac-bb32-defff1256ff3\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940369 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles\") pod \"bf462527-46a1-4092-b3de-bef4e89e73a6\" (UID: \"bf462527-46a1-4092-b3de-bef4e89e73a6\") " Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940601 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940657 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkfw\" (UniqueName: \"kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940720 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940725 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf462527-46a1-4092-b3de-bef4e89e73a6" (UID: "bf462527-46a1-4092-b3de-bef4e89e73a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940766 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca" (OuterVolumeSpecName: "client-ca") pod "3297aa73-488b-4cac-bb32-defff1256ff3" (UID: "3297aa73-488b-4cac-bb32-defff1256ff3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940803 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940935 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.940947 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.941295 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config" (OuterVolumeSpecName: "config") pod "bf462527-46a1-4092-b3de-bef4e89e73a6" (UID: "bf462527-46a1-4092-b3de-bef4e89e73a6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.941616 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bf462527-46a1-4092-b3de-bef4e89e73a6" (UID: "bf462527-46a1-4092-b3de-bef4e89e73a6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.941886 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config" (OuterVolumeSpecName: "config") pod "3297aa73-488b-4cac-bb32-defff1256ff3" (UID: "3297aa73-488b-4cac-bb32-defff1256ff3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.958433 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3297aa73-488b-4cac-bb32-defff1256ff3" (UID: "3297aa73-488b-4cac-bb32-defff1256ff3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.958474 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj" (OuterVolumeSpecName: "kube-api-access-8xczj") pod "3297aa73-488b-4cac-bb32-defff1256ff3" (UID: "3297aa73-488b-4cac-bb32-defff1256ff3"). InnerVolumeSpecName "kube-api-access-8xczj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.958483 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf462527-46a1-4092-b3de-bef4e89e73a6" (UID: "bf462527-46a1-4092-b3de-bef4e89e73a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:37 crc kubenswrapper[4752]: I0227 17:39:37.958527 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4" (OuterVolumeSpecName: "kube-api-access-gfwk4") pod "bf462527-46a1-4092-b3de-bef4e89e73a6" (UID: "bf462527-46a1-4092-b3de-bef4e89e73a6"). InnerVolumeSpecName "kube-api-access-gfwk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.005430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" event={"ID":"3297aa73-488b-4cac-bb32-defff1256ff3","Type":"ContainerDied","Data":"52c64f025ed5bbfd25b6c7747b48b46b768234895d9c37fa67cbe15931ecdfe9"} Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.005474 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.005481 4752 scope.go:117] "RemoveContainer" containerID="91bfbca401ce5f8258b044a03f1cacfa05c367b82c3166f6ac005d94207f35d9" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.007676 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" event={"ID":"bf462527-46a1-4092-b3de-bef4e89e73a6","Type":"ContainerDied","Data":"dfe618dd6d6f84d5f56e8e6c3c222ab650a88e32332c20723f359cac4831ccf8"} Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.007742 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.038001 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.042216 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b8dcf69-f7mfs"] Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.046748 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkfw\" (UniqueName: \"kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.046901 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.047018 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.047972 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050675 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050741 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfwk4\" (UniqueName: 
\"kubernetes.io/projected/bf462527-46a1-4092-b3de-bef4e89e73a6-kube-api-access-gfwk4\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050754 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3297aa73-488b-4cac-bb32-defff1256ff3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050764 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3297aa73-488b-4cac-bb32-defff1256ff3-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050775 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf462527-46a1-4092-b3de-bef4e89e73a6-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050785 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050793 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xczj\" (UniqueName: \"kubernetes.io/projected/3297aa73-488b-4cac-bb32-defff1256ff3-kube-api-access-8xczj\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.050801 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf462527-46a1-4092-b3de-bef4e89e73a6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.054570 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.054974 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.057989 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.061339 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84fb987488-h86k2"] Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.072725 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkfw\" (UniqueName: \"kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw\") pod \"route-controller-manager-7b9f458779-wkmbv\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.122643 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.915914 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3297aa73-488b-4cac-bb32-defff1256ff3" path="/var/lib/kubelet/pods/3297aa73-488b-4cac-bb32-defff1256ff3/volumes" Feb 27 17:39:38 crc kubenswrapper[4752]: I0227 17:39:38.919059 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf462527-46a1-4092-b3de-bef4e89e73a6" path="/var/lib/kubelet/pods/bf462527-46a1-4092-b3de-bef4e89e73a6/volumes" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.343681 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.345259 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.352207 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.352320 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.353336 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.353846 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.354610 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.354841 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.360047 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.363504 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.485722 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.485841 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx286\" (UniqueName: \"kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.485899 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.486327 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.486424 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.587238 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.587317 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.587367 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.587405 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx286\" (UniqueName: \"kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.587438 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.588981 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " 
pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.589287 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.590508 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.593162 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.605088 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx286\" (UniqueName: \"kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286\") pod \"controller-manager-786d4d7bd5-rc4rg\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:40 crc kubenswrapper[4752]: I0227 17:39:40.685092 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:44 crc kubenswrapper[4752]: E0227 17:39:44.600565 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:39:44 crc kubenswrapper[4752]: E0227 17:39:44.600690 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-694vw" podUID="2db85625-5324-4606-a2f1-740416e8d218" Feb 27 17:39:44 crc kubenswrapper[4752]: E0227 17:39:44.699992 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:39:44 crc kubenswrapper[4752]: E0227 17:39:44.700261 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92dcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6kwhk_openshift-marketplace(78323811-0abf-4cc6-921c-5d0e56e895a3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 17:39:44 crc kubenswrapper[4752]: E0227 17:39:44.702626 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:39:46 crc kubenswrapper[4752]: I0227 
17:39:46.080182 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dmrgx" Feb 27 17:39:48 crc kubenswrapper[4752]: E0227 17:39:48.421449 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" Feb 27 17:39:48 crc kubenswrapper[4752]: E0227 17:39:48.421667 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.434349 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.534070 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:48 crc kubenswrapper[4752]: E0227 17:39:48.590647 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 17:39:48 crc kubenswrapper[4752]: E0227 17:39:48.590812 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxkqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cllc6_openshift-marketplace(049892e0-329b-442b-b232-997ee454f9c6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Feb 27 17:39:48 crc kubenswrapper[4752]: E0227 17:39:48.592017 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cllc6" podUID="049892e0-329b-442b-b232-997ee454f9c6" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.710018 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.710997 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.713536 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.713651 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.715184 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.904843 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:48 crc kubenswrapper[4752]: I0227 17:39:48.905393 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:49 crc kubenswrapper[4752]: I0227 17:39:49.007683 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:49 crc kubenswrapper[4752]: I0227 17:39:49.007790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:49 crc kubenswrapper[4752]: I0227 17:39:49.008103 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:49 crc kubenswrapper[4752]: I0227 17:39:49.037675 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:49 crc kubenswrapper[4752]: I0227 17:39:49.329883 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.209942 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cllc6" podUID="049892e0-329b-442b-b232-997ee454f9c6" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.257734 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.257893 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:39:50 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:39:50 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9djm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536898-598km_openshift-infra(cc36acda-9447-479d-b741-c063ecb91f3e): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:39:50 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.259134 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:39:50 crc kubenswrapper[4752]: I0227 17:39:50.259894 4752 scope.go:117] "RemoveContainer" containerID="4e9df117af9fe9519deb3bbec86abd4af0521b55710df1bce23fa6f6d48a452c" 
Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.307238 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.308092 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btb9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dm9bt_openshift-marketplace(cad177e6-5ee1-4884-bb19-b9413b183acc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.310212 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dm9bt" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.344618 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.344813 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l262m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qvj4t_openshift-marketplace(137184c7-4f82-4685-89fa-d5152358e216): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.346261 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.422958 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.423101 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24ds4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zhhxr_openshift-marketplace(760298d8-7405-4c9e-b322-b08dbc182da8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 27 17:39:50 crc kubenswrapper[4752]: E0227 17:39:50.424472 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:39:50 crc kubenswrapper[4752]: I0227 17:39:50.747455 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 27 17:39:50 crc kubenswrapper[4752]: I0227 17:39:50.750970 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:50 crc kubenswrapper[4752]: I0227 17:39:50.753358 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 17:39:50 crc kubenswrapper[4752]: W0227 17:39:50.757568 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a007403_6f48_4177_b7f7_fcca306d531e.slice/crio-625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b WatchSource:0}: Error finding container 625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b: Status 404 returned error can't find the container with id 625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b Feb 27 17:39:50 crc kubenswrapper[4752]: W0227 17:39:50.775347 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod805dd32a_9306_453a_bae5_9f5914e4afdc.slice/crio-18d7bf521d6088ddbf1d1eaad00d198899ea32f92b36b241045d241be010e306 WatchSource:0}: Error finding container 18d7bf521d6088ddbf1d1eaad00d198899ea32f92b36b241045d241be010e306: Status 404 returned error can't find the container with id 
18d7bf521d6088ddbf1d1eaad00d198899ea32f92b36b241045d241be010e306 Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.108017 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" event={"ID":"805dd32a-9306-453a-bae5-9f5914e4afdc","Type":"ContainerStarted","Data":"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f"} Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.108661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" event={"ID":"805dd32a-9306-453a-bae5-9f5914e4afdc","Type":"ContainerStarted","Data":"18d7bf521d6088ddbf1d1eaad00d198899ea32f92b36b241045d241be010e306"} Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.108692 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.108306 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" podUID="805dd32a-9306-453a-bae5-9f5914e4afdc" containerName="controller-manager" containerID="cri-o://d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f" gracePeriod=30 Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.121385 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a007403-6f48-4177-b7f7-fcca306d531e","Type":"ContainerStarted","Data":"625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b"} Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.121720 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.126781 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" event={"ID":"68d2e603-f5c7-43b0-a311-adfb01551211","Type":"ContainerStarted","Data":"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969"} Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.126839 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" event={"ID":"68d2e603-f5c7-43b0-a311-adfb01551211","Type":"ContainerStarted","Data":"e8cfa7dc686516f48012affec81d4e2624ac957a387987107e82c667506793b6"} Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.127038 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" podUID="68d2e603-f5c7-43b0-a311-adfb01551211" containerName="route-controller-manager" containerID="cri-o://27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969" gracePeriod=30 Feb 27 17:39:51 crc kubenswrapper[4752]: E0227 17:39:51.128486 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:39:51 crc kubenswrapper[4752]: E0227 17:39:51.128815 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dm9bt" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" Feb 27 17:39:51 crc kubenswrapper[4752]: E0227 17:39:51.129517 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.138602 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" podStartSLOduration=23.13858671 podStartE2EDuration="23.13858671s" podCreationTimestamp="2026-02-27 17:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:51.135965386 +0000 UTC m=+291.042782247" watchObservedRunningTime="2026-02-27 17:39:51.13858671 +0000 UTC m=+291.045403561" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.177308 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" podStartSLOduration=23.177281878 podStartE2EDuration="23.177281878s" podCreationTimestamp="2026-02-27 17:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:51.156919734 +0000 UTC m=+291.063736605" watchObservedRunningTime="2026-02-27 17:39:51.177281878 +0000 UTC m=+291.084098729" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.505291 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.545217 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:39:51 crc kubenswrapper[4752]: E0227 17:39:51.547195 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805dd32a-9306-453a-bae5-9f5914e4afdc" containerName="controller-manager" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.547221 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="805dd32a-9306-453a-bae5-9f5914e4afdc" containerName="controller-manager" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.552535 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="805dd32a-9306-453a-bae5-9f5914e4afdc" containerName="controller-manager" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.552560 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx286\" (UniqueName: \"kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286\") pod \"805dd32a-9306-453a-bae5-9f5914e4afdc\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.552988 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.553087 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.554792 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert\") pod \"805dd32a-9306-453a-bae5-9f5914e4afdc\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.555324 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config\") pod \"805dd32a-9306-453a-bae5-9f5914e4afdc\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.555386 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca\") pod \"805dd32a-9306-453a-bae5-9f5914e4afdc\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.555421 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles\") pod \"805dd32a-9306-453a-bae5-9f5914e4afdc\" (UID: \"805dd32a-9306-453a-bae5-9f5914e4afdc\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.556542 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca" (OuterVolumeSpecName: "client-ca") pod "805dd32a-9306-453a-bae5-9f5914e4afdc" (UID: "805dd32a-9306-453a-bae5-9f5914e4afdc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.556579 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "805dd32a-9306-453a-bae5-9f5914e4afdc" (UID: "805dd32a-9306-453a-bae5-9f5914e4afdc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.556985 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config" (OuterVolumeSpecName: "config") pod "805dd32a-9306-453a-bae5-9f5914e4afdc" (UID: "805dd32a-9306-453a-bae5-9f5914e4afdc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.561078 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "805dd32a-9306-453a-bae5-9f5914e4afdc" (UID: "805dd32a-9306-453a-bae5-9f5914e4afdc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.564884 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286" (OuterVolumeSpecName: "kube-api-access-sx286") pod "805dd32a-9306-453a-bae5-9f5914e4afdc" (UID: "805dd32a-9306-453a-bae5-9f5914e4afdc"). InnerVolumeSpecName "kube-api-access-sx286". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657657 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657734 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657752 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24p2j\" (UniqueName: \"kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657784 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657802 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657860 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657871 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657881 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx286\" (UniqueName: \"kubernetes.io/projected/805dd32a-9306-453a-bae5-9f5914e4afdc-kube-api-access-sx286\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc 
kubenswrapper[4752]: I0227 17:39:51.657891 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/805dd32a-9306-453a-bae5-9f5914e4afdc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.657899 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805dd32a-9306-453a-bae5-9f5914e4afdc-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.662901 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7b9f458779-wkmbv_68d2e603-f5c7-43b0-a311-adfb01551211/route-controller-manager/0.log" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.662964 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.761924 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config\") pod \"68d2e603-f5c7-43b0-a311-adfb01551211\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.762003 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert\") pod \"68d2e603-f5c7-43b0-a311-adfb01551211\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.762041 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca\") pod \"68d2e603-f5c7-43b0-a311-adfb01551211\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.762071 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxkfw\" (UniqueName: \"kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw\") pod \"68d2e603-f5c7-43b0-a311-adfb01551211\" (UID: \"68d2e603-f5c7-43b0-a311-adfb01551211\") " Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763401 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24p2j\" (UniqueName: \"kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j\") pod \"controller-manager-5d45888664-sjzsx\" (UID: 
\"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763435 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763459 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763546 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config" (OuterVolumeSpecName: "config") pod "68d2e603-f5c7-43b0-a311-adfb01551211" (UID: "68d2e603-f5c7-43b0-a311-adfb01551211"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.763537 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca" (OuterVolumeSpecName: "client-ca") pod "68d2e603-f5c7-43b0-a311-adfb01551211" (UID: "68d2e603-f5c7-43b0-a311-adfb01551211"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.764501 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.765119 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.767877 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68d2e603-f5c7-43b0-a311-adfb01551211" (UID: "68d2e603-f5c7-43b0-a311-adfb01551211"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.767898 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.773528 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw" (OuterVolumeSpecName: "kube-api-access-pxkfw") pod "68d2e603-f5c7-43b0-a311-adfb01551211" (UID: "68d2e603-f5c7-43b0-a311-adfb01551211"). InnerVolumeSpecName "kube-api-access-pxkfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.774101 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.783857 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24p2j\" (UniqueName: \"kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j\") pod \"controller-manager-5d45888664-sjzsx\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.865222 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.865286 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68d2e603-f5c7-43b0-a311-adfb01551211-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.865309 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68d2e603-f5c7-43b0-a311-adfb01551211-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.865327 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxkfw\" (UniqueName: \"kubernetes.io/projected/68d2e603-f5c7-43b0-a311-adfb01551211-kube-api-access-pxkfw\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:51 crc kubenswrapper[4752]: I0227 17:39:51.896701 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136043 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7b9f458779-wkmbv_68d2e603-f5c7-43b0-a311-adfb01551211/route-controller-manager/0.log" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136104 4752 generic.go:334] "Generic (PLEG): container finished" podID="68d2e603-f5c7-43b0-a311-adfb01551211" containerID="27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969" exitCode=255 Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" event={"ID":"68d2e603-f5c7-43b0-a311-adfb01551211","Type":"ContainerDied","Data":"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969"} Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136238 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136267 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv" event={"ID":"68d2e603-f5c7-43b0-a311-adfb01551211","Type":"ContainerDied","Data":"e8cfa7dc686516f48012affec81d4e2624ac957a387987107e82c667506793b6"} Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.136290 4752 scope.go:117] "RemoveContainer" containerID="27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.138852 4752 generic.go:334] "Generic (PLEG): container finished" podID="805dd32a-9306-453a-bae5-9f5914e4afdc" containerID="d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f" exitCode=0 Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.138952 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.139183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" event={"ID":"805dd32a-9306-453a-bae5-9f5914e4afdc","Type":"ContainerDied","Data":"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f"} Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.139221 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg" event={"ID":"805dd32a-9306-453a-bae5-9f5914e4afdc","Type":"ContainerDied","Data":"18d7bf521d6088ddbf1d1eaad00d198899ea32f92b36b241045d241be010e306"} Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.139236 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.141395 4752 generic.go:334] "Generic (PLEG): container finished" podID="1a007403-6f48-4177-b7f7-fcca306d531e" containerID="7d0455f5f2104102fca3a2afe82a4d20d59ac1bd699810420e3c085513d512d3" exitCode=0 Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.141435 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a007403-6f48-4177-b7f7-fcca306d531e","Type":"ContainerDied","Data":"7d0455f5f2104102fca3a2afe82a4d20d59ac1bd699810420e3c085513d512d3"} Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.161046 4752 scope.go:117] "RemoveContainer" containerID="27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969" Feb 27 17:39:52 crc kubenswrapper[4752]: E0227 17:39:52.161872 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969\": container with ID starting with 27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969 not found: ID does not exist" containerID="27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.161915 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969"} err="failed to get container status \"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969\": rpc error: code = NotFound desc = could not find container \"27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969\": container with ID starting with 27d4773780cf3d27a2c86bcb3a797f23baaa6f82b1793d67aa27b9eb0fce6969 not found: ID does not exist" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.161944 4752 scope.go:117] "RemoveContainer" containerID="d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.182525 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.189406 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9f458779-wkmbv"] Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.192344 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 
17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.192994 4752 scope.go:117] "RemoveContainer" containerID="d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f" Feb 27 17:39:52 crc kubenswrapper[4752]: E0227 17:39:52.193394 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f\": container with ID starting with d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f not found: ID does not exist" containerID="d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.193422 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f"} err="failed to get container status \"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f\": rpc error: code = NotFound desc = could not find container \"d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f\": container with ID starting with d26a32b814dd826e64e1eafa4673af019c03af28b2b184edb0253348d433f18f not found: ID does not exist" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.196270 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-786d4d7bd5-rc4rg"] Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.916425 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d2e603-f5c7-43b0-a311-adfb01551211" path="/var/lib/kubelet/pods/68d2e603-f5c7-43b0-a311-adfb01551211/volumes" Feb 27 17:39:52 crc kubenswrapper[4752]: I0227 17:39:52.917528 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805dd32a-9306-453a-bae5-9f5914e4afdc" path="/var/lib/kubelet/pods/805dd32a-9306-453a-bae5-9f5914e4afdc/volumes" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.153522 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" event={"ID":"599b1126-dacf-42a6-aabd-84f8177774cd","Type":"ContainerStarted","Data":"fb30b0e0e1bf9f8428eaf261c676ea62d599c347110380afbb611de26796eca7"} Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.153577 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" event={"ID":"599b1126-dacf-42a6-aabd-84f8177774cd","Type":"ContainerStarted","Data":"0e26009d696737b0f4c09d2a69f038a2bda3893810f7853be825276744a892df"} Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.175356 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" podStartSLOduration=5.175339505 podStartE2EDuration="5.175339505s" podCreationTimestamp="2026-02-27 17:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:53.173626564 +0000 UTC m=+293.080443415" watchObservedRunningTime="2026-02-27 17:39:53.175339505 +0000 UTC m=+293.082156356" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.424989 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.496005 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 17:39:53 crc kubenswrapper[4752]: E0227 17:39:53.496308 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a007403-6f48-4177-b7f7-fcca306d531e" containerName="pruner" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.496332 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a007403-6f48-4177-b7f7-fcca306d531e" containerName="pruner" Feb 27 17:39:53 crc kubenswrapper[4752]: E0227 17:39:53.496346 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d2e603-f5c7-43b0-a311-adfb01551211" containerName="route-controller-manager" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.496353 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d2e603-f5c7-43b0-a311-adfb01551211" containerName="route-controller-manager" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.496476 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d2e603-f5c7-43b0-a311-adfb01551211" containerName="route-controller-manager" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.496488 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a007403-6f48-4177-b7f7-fcca306d531e" containerName="pruner" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.497525 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.502323 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.589271 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir\") pod \"1a007403-6f48-4177-b7f7-fcca306d531e\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.589412 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a007403-6f48-4177-b7f7-fcca306d531e" (UID: "1a007403-6f48-4177-b7f7-fcca306d531e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.589374 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access\") pod \"1a007403-6f48-4177-b7f7-fcca306d531e\" (UID: \"1a007403-6f48-4177-b7f7-fcca306d531e\") " Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.590728 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a007403-6f48-4177-b7f7-fcca306d531e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.598447 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a007403-6f48-4177-b7f7-fcca306d531e" (UID: "1a007403-6f48-4177-b7f7-fcca306d531e"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.692253 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.693043 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.693172 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.693213 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a007403-6f48-4177-b7f7-fcca306d531e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.793450 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.793517 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.793546 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.793628 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.793680 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.812495 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:53 crc kubenswrapper[4752]: I0227 17:39:53.818795 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.161284 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1a007403-6f48-4177-b7f7-fcca306d531e","Type":"ContainerDied","Data":"625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b"} Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.161323 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.161371 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625ebd41b1f51cde59ee0f36726a53c47274bbd78eb4175584b71d75ff66147b" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.161779 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.168200 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.262274 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.275086 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.350078 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.351884 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.357451 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.358815 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.358814 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.359043 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.359224 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.358946 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.362776 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.502180 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.502216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.502440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.502518 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngqd\" (UniqueName: \"kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.603283 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca\") pod \"route-controller-manager-548694564-hszg5\" 
(UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.603330 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.603405 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.603440 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wngqd\" (UniqueName: \"kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.604937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.605100 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.615089 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.622417 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngqd\" (UniqueName: \"kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd\") pod \"route-controller-manager-548694564-hszg5\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:54 crc kubenswrapper[4752]: I0227 17:39:54.706069 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:55 crc kubenswrapper[4752]: I0227 17:39:55.099681 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:39:55 crc kubenswrapper[4752]: W0227 17:39:55.109644 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce533640_e7b2_49f9_b183_4eb32d73e6e9.slice/crio-f2c973b2d65140289257373b776bd0312e594dcb568b28f6f33fcd019ad64f16 WatchSource:0}: Error finding container f2c973b2d65140289257373b776bd0312e594dcb568b28f6f33fcd019ad64f16: Status 404 returned error can't find the container with id f2c973b2d65140289257373b776bd0312e594dcb568b28f6f33fcd019ad64f16 Feb 27 17:39:55 crc kubenswrapper[4752]: I0227 17:39:55.167976 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" event={"ID":"ce533640-e7b2-49f9-b183-4eb32d73e6e9","Type":"ContainerStarted","Data":"f2c973b2d65140289257373b776bd0312e594dcb568b28f6f33fcd019ad64f16"} Feb 27 17:39:55 crc kubenswrapper[4752]: I0227 17:39:55.170138 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba","Type":"ContainerStarted","Data":"53846f1a1a4722936fbb9592341b2aa3a7b6690e35684dee2c4d104d1aef013c"} Feb 27 17:39:55 crc kubenswrapper[4752]: I0227 17:39:55.170208 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba","Type":"ContainerStarted","Data":"7a9af0f29c36227bf90295a4a21e46c3307a6d70e2fbee8b2a9d857f7a6703d3"} Feb 27 17:39:55 crc kubenswrapper[4752]: I0227 17:39:55.188174 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.18815494 podStartE2EDuration="2.18815494s" podCreationTimestamp="2026-02-27 17:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:55.184172433 +0000 UTC m=+295.090989294" watchObservedRunningTime="2026-02-27 17:39:55.18815494 +0000 UTC m=+295.094971791" Feb 27 17:39:56 crc kubenswrapper[4752]: I0227 17:39:56.176001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" event={"ID":"ce533640-e7b2-49f9-b183-4eb32d73e6e9","Type":"ContainerStarted","Data":"159b99618110c3b8d7dc3c8a87ecbcbdbc9c7c83daa95836e5f04b6543647a57"} Feb 27 17:39:56 crc kubenswrapper[4752]: I0227 17:39:56.193713 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" podStartSLOduration=8.193698747 podStartE2EDuration="8.193698747s" podCreationTimestamp="2026-02-27 17:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:39:56.189598557 +0000 UTC m=+296.096415418" watchObservedRunningTime="2026-02-27 17:39:56.193698747 +0000 UTC m=+296.100515598" Feb 27 17:39:57 crc kubenswrapper[4752]: I0227 17:39:57.181190 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:57 crc kubenswrapper[4752]: I0227 17:39:57.186979 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:39:58 crc kubenswrapper[4752]: E0227 17:39:58.438069 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:39:58 crc kubenswrapper[4752]: E0227 17:39:58.438504 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrvgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b7x2z_openshift-marketplace(899d1101-b4de-4326-b442-6450903b2a30): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:39:58 crc kubenswrapper[4752]: E0227 17:39:58.439850 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.134005 4752 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536900-5b8dn"] Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.135229 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.137044 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.140579 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536900-5b8dn"] Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.178991 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz6hg\" (UniqueName: \"kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg\") pod \"auto-csr-approver-29536900-5b8dn\" (UID: \"4c7d2d1c-023b-43e1-9015-5b572f4648cf\") " pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.280892 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz6hg\" (UniqueName: \"kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg\") pod \"auto-csr-approver-29536900-5b8dn\" (UID: \"4c7d2d1c-023b-43e1-9015-5b572f4648cf\") " pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.298046 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz6hg\" (UniqueName: \"kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg\") pod \"auto-csr-approver-29536900-5b8dn\" (UID: \"4c7d2d1c-023b-43e1-9015-5b572f4648cf\") " pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.451397 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:00 crc kubenswrapper[4752]: I0227 17:40:00.843692 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536900-5b8dn"] Feb 27 17:40:03 crc kubenswrapper[4752]: I0227 17:40:03.218329 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" event={"ID":"4c7d2d1c-023b-43e1-9015-5b572f4648cf","Type":"ContainerStarted","Data":"beae0dafa17616e4e47a6230d0e68a21646412fdc5ad115c0b00792631002ea3"} Feb 27 17:40:03 crc kubenswrapper[4752]: I0227 17:40:03.220568 4752 generic.go:334] "Generic (PLEG): container finished" podID="2db85625-5324-4606-a2f1-740416e8d218" containerID="a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1" exitCode=0 Feb 27 17:40:03 crc kubenswrapper[4752]: I0227 17:40:03.220624 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerDied","Data":"a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1"} Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.056243 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.056416 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:40:04 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:40:04 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pz6hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536900-5b8dn_openshift-infra(4c7d2d1c-023b-43e1-9015-5b572f4648cf): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:40:04 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.058048 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" Feb 27 17:40:04 crc kubenswrapper[4752]: I0227 17:40:04.228571 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerStarted","Data":"330339f22036915fad436a77ebf48db28ece31cb5269f8e79a29b514b1908f62"} Feb 27 17:40:04 crc kubenswrapper[4752]: I0227 17:40:04.232049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerStarted","Data":"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435"} Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.233340 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.267070 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.267221 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92dcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6kwhk_openshift-marketplace(78323811-0abf-4cc6-921c-5d0e56e895a3): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.268429 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:40:04 crc kubenswrapper[4752]: I0227 17:40:04.272187 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-694vw" podStartSLOduration=4.144965458 podStartE2EDuration="54.272173265s" podCreationTimestamp="2026-02-27 17:39:10 +0000 UTC" firstStartedPulling="2026-02-27 17:39:13.615168855 +0000 UTC m=+253.521985706" lastFinishedPulling="2026-02-27 17:40:03.742376632 +0000 UTC m=+303.649193513" observedRunningTime="2026-02-27 17:40:04.270110155 +0000 UTC m=+304.176927006" watchObservedRunningTime="2026-02-27 17:40:04.272173265 +0000 UTC m=+304.178990116" Feb 27 17:40:04 crc kubenswrapper[4752]: E0227 17:40:04.928277 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.115388 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.115641 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l262m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qvj4t_openshift-marketplace(137184c7-4f82-4685-89fa-d5152358e216): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.117326 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" Feb 27 17:40:05 crc kubenswrapper[4752]: I0227 17:40:05.240450 4752 generic.go:334] "Generic (PLEG): container finished" podID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerID="330339f22036915fad436a77ebf48db28ece31cb5269f8e79a29b514b1908f62" exitCode=0 Feb 27 17:40:05 crc kubenswrapper[4752]: I0227 17:40:05.240517 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerDied","Data":"330339f22036915fad436a77ebf48db28ece31cb5269f8e79a29b514b1908f62"} Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.734877 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.735054 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btb9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dm9bt_openshift-marketplace(cad177e6-5ee1-4884-bb19-b9413b183acc): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:05 crc kubenswrapper[4752]: E0227 17:40:05.737099 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-dm9bt" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" Feb 27 17:40:06 crc kubenswrapper[4752]: E0227 17:40:06.129851 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:40:06 crc kubenswrapper[4752]: E0227 17:40:06.130040 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24ds4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zhhxr_openshift-marketplace(760298d8-7405-4c9e-b322-b08dbc182da8): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:06 crc kubenswrapper[4752]: E0227 17:40:06.131226 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:40:06 crc kubenswrapper[4752]: I0227 17:40:06.323964 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:40:06 crc kubenswrapper[4752]: I0227 17:40:06.324344 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:40:06 crc kubenswrapper[4752]: I0227 17:40:06.324405 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:40:06 crc kubenswrapper[4752]: I0227 17:40:06.325089 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f"} 
pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:40:06 crc kubenswrapper[4752]: I0227 17:40:06.325188 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f" gracePeriod=600 Feb 27 17:40:07 crc kubenswrapper[4752]: I0227 17:40:07.253838 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f" exitCode=0 Feb 27 17:40:07 crc kubenswrapper[4752]: I0227 17:40:07.253895 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f"} Feb 27 17:40:08 crc kubenswrapper[4752]: I0227 17:40:08.263918 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d"} Feb 27 17:40:08 crc kubenswrapper[4752]: I0227 17:40:08.441273 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:40:08 crc kubenswrapper[4752]: I0227 17:40:08.441577 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" podUID="599b1126-dacf-42a6-aabd-84f8177774cd" containerName="controller-manager" containerID="cri-o://fb30b0e0e1bf9f8428eaf261c676ea62d599c347110380afbb611de26796eca7" gracePeriod=30 Feb 27 17:40:08 crc kubenswrapper[4752]: I0227 17:40:08.484524 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:40:08 crc kubenswrapper[4752]: I0227 17:40:08.484772 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" podUID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" containerName="route-controller-manager" containerID="cri-o://159b99618110c3b8d7dc3c8a87ecbcbdbc9c7c83daa95836e5f04b6543647a57" gracePeriod=30 Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.142768 4752 ???:1] "http: TLS handshake error from 192.168.126.11:41940: no serving certificate available for the kubelet" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.278652 4752 generic.go:334] "Generic (PLEG): container finished" podID="599b1126-dacf-42a6-aabd-84f8177774cd" containerID="fb30b0e0e1bf9f8428eaf261c676ea62d599c347110380afbb611de26796eca7" exitCode=0 Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.278708 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" event={"ID":"599b1126-dacf-42a6-aabd-84f8177774cd","Type":"ContainerDied","Data":"fb30b0e0e1bf9f8428eaf261c676ea62d599c347110380afbb611de26796eca7"} Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.280202 4752 generic.go:334] "Generic (PLEG): 
container finished" podID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" containerID="159b99618110c3b8d7dc3c8a87ecbcbdbc9c7c83daa95836e5f04b6543647a57" exitCode=0 Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.280810 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" event={"ID":"ce533640-e7b2-49f9-b183-4eb32d73e6e9","Type":"ContainerDied","Data":"159b99618110c3b8d7dc3c8a87ecbcbdbc9c7c83daa95836e5f04b6543647a57"} Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.507396 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.534677 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:09 crc kubenswrapper[4752]: E0227 17:40:09.534899 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" containerName="route-controller-manager" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.534919 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" containerName="route-controller-manager" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.535202 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" containerName="route-controller-manager" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.535610 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.545578 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621025 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca\") pod \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621111 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config\") pod \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621132 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert\") pod \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621222 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wngqd\" (UniqueName: \"kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd\") pod \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\" (UID: \"ce533640-e7b2-49f9-b183-4eb32d73e6e9\") " Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621336 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621358 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621384 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621453 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzc7\" (UniqueName: \"kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.621929 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "ce533640-e7b2-49f9-b183-4eb32d73e6e9" (UID: "ce533640-e7b2-49f9-b183-4eb32d73e6e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.622936 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config" (OuterVolumeSpecName: "config") pod "ce533640-e7b2-49f9-b183-4eb32d73e6e9" (UID: "ce533640-e7b2-49f9-b183-4eb32d73e6e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.630735 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd" (OuterVolumeSpecName: "kube-api-access-wngqd") pod "ce533640-e7b2-49f9-b183-4eb32d73e6e9" (UID: "ce533640-e7b2-49f9-b183-4eb32d73e6e9"). InnerVolumeSpecName "kube-api-access-wngqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.631413 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ce533640-e7b2-49f9-b183-4eb32d73e6e9" (UID: "ce533640-e7b2-49f9-b183-4eb32d73e6e9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.723131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzc7\" (UniqueName: \"kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.723235 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.723255 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724523 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724748 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724765 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce533640-e7b2-49f9-b183-4eb32d73e6e9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724779 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wngqd\" (UniqueName: \"kubernetes.io/projected/ce533640-e7b2-49f9-b183-4eb32d73e6e9-kube-api-access-wngqd\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724792 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce533640-e7b2-49f9-b183-4eb32d73e6e9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724835 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.724885 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca\") pod \"route-controller-manager-5589d447db-8zx2p\" 
(UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.728835 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.753035 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzc7\" (UniqueName: \"kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7\") pod \"route-controller-manager-5589d447db-8zx2p\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") " pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:09 crc kubenswrapper[4752]: I0227 17:40:09.893736 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.287430 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerStarted","Data":"8aeb418c116a39ee31cd0026e29f7c590cba724acc16d7b4a09d9ae052d82151"} Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.289524 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerStarted","Data":"01da1ce333dfd3dcda701b9059d039a737a4049c29307747c4b9cd58800cf038"} Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.296038 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" event={"ID":"ce533640-e7b2-49f9-b183-4eb32d73e6e9","Type":"ContainerDied","Data":"f2c973b2d65140289257373b776bd0312e594dcb568b28f6f33fcd019ad64f16"} Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.296080 4752 scope.go:117] "RemoveContainer" containerID="159b99618110c3b8d7dc3c8a87ecbcbdbc9c7c83daa95836e5f04b6543647a57" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.296200 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-548694564-hszg5" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.304454 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8w4t" podStartSLOduration=4.798907626 podStartE2EDuration="1m0.30443189s" podCreationTimestamp="2026-02-27 17:39:10 +0000 UTC" firstStartedPulling="2026-02-27 17:39:13.577950656 +0000 UTC m=+253.484767507" lastFinishedPulling="2026-02-27 17:40:09.08347492 +0000 UTC m=+308.990291771" observedRunningTime="2026-02-27 17:40:10.303320503 +0000 UTC m=+310.210137354" watchObservedRunningTime="2026-02-27 17:40:10.30443189 +0000 UTC m=+310.211248741" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.344706 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.347995 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-548694564-hszg5"] Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.428184 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.435181 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:10 crc kubenswrapper[4752]: W0227 17:40:10.437978 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9534fc15_28d5_4739_a207_0b657411460b.slice/crio-0d0adfe6fc095f25c983e92c9bb2e947e5b491a9a67f3f47a7061ee31e35b62a WatchSource:0}: Error finding container 0d0adfe6fc095f25c983e92c9bb2e947e5b491a9a67f3f47a7061ee31e35b62a: Status 404 returned error can't find the container with id 0d0adfe6fc095f25c983e92c9bb2e947e5b491a9a67f3f47a7061ee31e35b62a Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.531402 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert\") pod \"599b1126-dacf-42a6-aabd-84f8177774cd\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.531482 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24p2j\" (UniqueName: \"kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j\") pod \"599b1126-dacf-42a6-aabd-84f8177774cd\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.531536 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config\") pod \"599b1126-dacf-42a6-aabd-84f8177774cd\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.531574 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca\") pod \"599b1126-dacf-42a6-aabd-84f8177774cd\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 
17:40:10.531609 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles\") pod \"599b1126-dacf-42a6-aabd-84f8177774cd\" (UID: \"599b1126-dacf-42a6-aabd-84f8177774cd\") " Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.537530 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "599b1126-dacf-42a6-aabd-84f8177774cd" (UID: "599b1126-dacf-42a6-aabd-84f8177774cd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.537910 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "599b1126-dacf-42a6-aabd-84f8177774cd" (UID: "599b1126-dacf-42a6-aabd-84f8177774cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.544101 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config" (OuterVolumeSpecName: "config") pod "599b1126-dacf-42a6-aabd-84f8177774cd" (UID: "599b1126-dacf-42a6-aabd-84f8177774cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.548256 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j" (OuterVolumeSpecName: "kube-api-access-24p2j") pod "599b1126-dacf-42a6-aabd-84f8177774cd" (UID: "599b1126-dacf-42a6-aabd-84f8177774cd"). InnerVolumeSpecName "kube-api-access-24p2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.553403 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "599b1126-dacf-42a6-aabd-84f8177774cd" (UID: "599b1126-dacf-42a6-aabd-84f8177774cd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.632710 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24p2j\" (UniqueName: \"kubernetes.io/projected/599b1126-dacf-42a6-aabd-84f8177774cd-kube-api-access-24p2j\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.632749 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.632759 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-client-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.632770 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/599b1126-dacf-42a6-aabd-84f8177774cd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.632778 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/599b1126-dacf-42a6-aabd-84f8177774cd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.796576 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.796621 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:40:10 crc kubenswrapper[4752]: I0227 17:40:10.913886 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce533640-e7b2-49f9-b183-4eb32d73e6e9" path="/var/lib/kubelet/pods/ce533640-e7b2-49f9-b183-4eb32d73e6e9/volumes" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.213088 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.213418 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.268057 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.308727 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" event={"ID":"599b1126-dacf-42a6-aabd-84f8177774cd","Type":"ContainerDied","Data":"0e26009d696737b0f4c09d2a69f038a2bda3893810f7853be825276744a892df"} Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.308778 4752 scope.go:117] "RemoveContainer" containerID="fb30b0e0e1bf9f8428eaf261c676ea62d599c347110380afbb611de26796eca7" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.308812 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d45888664-sjzsx" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.311136 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" event={"ID":"9534fc15-28d5-4739-a207-0b657411460b","Type":"ContainerStarted","Data":"06d4e84ec3a4dbef2d1bbf568fa725299c7a31b95d0052c693ec7ecd2e747b59"} Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.311213 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" event={"ID":"9534fc15-28d5-4739-a207-0b657411460b","Type":"ContainerStarted","Data":"0d0adfe6fc095f25c983e92c9bb2e947e5b491a9a67f3f47a7061ee31e35b62a"} Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.312203 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.316622 4752 generic.go:334] "Generic (PLEG): container finished" podID="049892e0-329b-442b-b232-997ee454f9c6" containerID="01da1ce333dfd3dcda701b9059d039a737a4049c29307747c4b9cd58800cf038" exitCode=0 Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.316736 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerDied","Data":"01da1ce333dfd3dcda701b9059d039a737a4049c29307747c4b9cd58800cf038"} Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.317709 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.353776 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" podStartSLOduration=3.3537552489999998 podStartE2EDuration="3.353755249s" podCreationTimestamp="2026-02-27 17:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:40:11.342583098 +0000 UTC m=+311.249399989" watchObservedRunningTime="2026-02-27 17:40:11.353755249 +0000 UTC m=+311.260572110" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.358262 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.362259 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d45888664-sjzsx"] Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.368977 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.949260 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:40:11 crc kubenswrapper[4752]: I0227 17:40:11.976687 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="registry-server" probeResult="failure" output=< Feb 27 17:40:11 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Feb 27 
17:40:11 crc kubenswrapper[4752]: > Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.325076 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerStarted","Data":"0fc93b53f02a726982f59c05dd78edcc2b73d771891e7dc9366413374a0da2b8"} Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.350766 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cllc6" podStartSLOduration=3.324751901 podStartE2EDuration="59.350744488s" podCreationTimestamp="2026-02-27 17:39:13 +0000 UTC" firstStartedPulling="2026-02-27 17:39:15.794082456 +0000 UTC m=+255.700899307" lastFinishedPulling="2026-02-27 17:40:11.820075033 +0000 UTC m=+311.726891894" observedRunningTime="2026-02-27 17:40:12.349328683 +0000 UTC m=+312.256145574" watchObservedRunningTime="2026-02-27 17:40:12.350744488 +0000 UTC m=+312.257561349" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.368792 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:12 crc kubenswrapper[4752]: E0227 17:40:12.369228 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599b1126-dacf-42a6-aabd-84f8177774cd" containerName="controller-manager" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.369265 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="599b1126-dacf-42a6-aabd-84f8177774cd" containerName="controller-manager" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.369494 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="599b1126-dacf-42a6-aabd-84f8177774cd" containerName="controller-manager" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.370112 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.372328 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.372706 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.373302 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.373871 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.373904 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.374270 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.381356 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.388023 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.558105 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.558165 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk58k\" (UniqueName: \"kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.558191 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.558216 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.558453 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.659476 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.659542 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk58k\" (UniqueName: \"kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.659593 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.659695 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.659769 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.662207 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.662244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.665523 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " 
pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.673451 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.680795 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk58k\" (UniqueName: \"kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k\") pod \"controller-manager-6d9c88f797-h9v2p\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") " pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.693526 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:12 crc kubenswrapper[4752]: E0227 17:40:12.910522 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:40:12 crc kubenswrapper[4752]: I0227 17:40:12.918345 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599b1126-dacf-42a6-aabd-84f8177774cd" path="/var/lib/kubelet/pods/599b1126-dacf-42a6-aabd-84f8177774cd/volumes" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.116050 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.338280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" event={"ID":"630c3496-e651-4923-869a-85ab326b2a36","Type":"ContainerStarted","Data":"2beb1f23929946c6847396243f4e3bab90033a98d8842a35cd0d23a9dffbc45b"} Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.338671 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" event={"ID":"630c3496-e651-4923-869a-85ab326b2a36","Type":"ContainerStarted","Data":"8ae7b1f53fef38dc7581c346ee84d8c69fe044d89bdc7c93dfcad2c989730ca4"} Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.338561 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-694vw" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="registry-server" containerID="cri-o://5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435" gracePeriod=2 Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.807891 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.830716 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" podStartSLOduration=5.830698435 podStartE2EDuration="5.830698435s" podCreationTimestamp="2026-02-27 17:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:40:13.367836414 +0000 UTC m=+313.274653265" watchObservedRunningTime="2026-02-27 17:40:13.830698435 +0000 UTC m=+313.737515276" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.863181 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.863230 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.978355 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjlzt\" (UniqueName: \"kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt\") pod \"2db85625-5324-4606-a2f1-740416e8d218\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.978537 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content\") pod \"2db85625-5324-4606-a2f1-740416e8d218\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.978637 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities\") pod \"2db85625-5324-4606-a2f1-740416e8d218\" (UID: \"2db85625-5324-4606-a2f1-740416e8d218\") " Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.979364 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities" (OuterVolumeSpecName: "utilities") pod "2db85625-5324-4606-a2f1-740416e8d218" (UID: "2db85625-5324-4606-a2f1-740416e8d218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:40:13 crc kubenswrapper[4752]: I0227 17:40:13.985179 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt" (OuterVolumeSpecName: "kube-api-access-sjlzt") pod "2db85625-5324-4606-a2f1-740416e8d218" (UID: "2db85625-5324-4606-a2f1-740416e8d218"). InnerVolumeSpecName "kube-api-access-sjlzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.035801 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db85625-5324-4606-a2f1-740416e8d218" (UID: "2db85625-5324-4606-a2f1-740416e8d218"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.080111 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjlzt\" (UniqueName: \"kubernetes.io/projected/2db85625-5324-4606-a2f1-740416e8d218-kube-api-access-sjlzt\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.080163 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.080180 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db85625-5324-4606-a2f1-740416e8d218-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.345428 4752 generic.go:334] "Generic (PLEG): container finished" podID="2db85625-5324-4606-a2f1-740416e8d218" containerID="5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435" exitCode=0 Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.345480 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-694vw" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.345504 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerDied","Data":"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435"} Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.345558 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-694vw" event={"ID":"2db85625-5324-4606-a2f1-740416e8d218","Type":"ContainerDied","Data":"4e5af055ad8d648a0e0c4f942e2f556bd16aa6a76022acf5bc40b130a1a9c514"} Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.345577 4752 scope.go:117] "RemoveContainer" containerID="5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.346043 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.360255 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.361900 4752 scope.go:117] "RemoveContainer" containerID="a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.376445 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.380684 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-694vw"] Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.395851 4752 scope.go:117] "RemoveContainer" containerID="43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.419167 4752 scope.go:117] "RemoveContainer" containerID="5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435" Feb 27 17:40:14 crc kubenswrapper[4752]: E0227 17:40:14.420378 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435\": container with ID starting with 5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435 not found: ID does not exist" containerID="5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.420424 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435"} err="failed to get container status \"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435\": rpc error: code = NotFound desc = could not find container \"5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435\": container with ID starting with 5a18461110d98b7163f3f5f0ae364690e3afdd3c7ed957cf11680668c71d0435 not found: ID does not exist" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.420452 4752 scope.go:117] "RemoveContainer" containerID="a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1" Feb 27 17:40:14 crc kubenswrapper[4752]: E0227 17:40:14.420757 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1\": container with ID starting with a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1 not found: ID does not exist" containerID="a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.420795 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1"} err="failed to get container status \"a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1\": rpc error: code = NotFound desc = could not find container \"a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1\": container with ID starting with a8046adc4ec96a80ce78629877f1a8269ff305beba9863bcdffa8d82c4961be1 not found: ID does not exist" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.420822 4752 scope.go:117] "RemoveContainer" containerID="43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36" Feb 27 17:40:14 crc kubenswrapper[4752]: E0227 17:40:14.421262 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36\": container with ID starting with 43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36 not found: ID does not exist" containerID="43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.421343 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36"} err="failed to get container status \"43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36\": rpc error: code = NotFound desc = could not find container \"43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36\": container with ID starting with 43f0cfa84a9cb4890e29806b8a305a7651dfa7ddb553f2bf9ca7381a8339fe36 not found: ID does not exist" Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.910127 4752 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-cllc6" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="registry-server" probeResult="failure" output=< Feb 27 17:40:14 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Feb 27 17:40:14 crc kubenswrapper[4752]: > Feb 27 17:40:14 crc kubenswrapper[4752]: I0227 17:40:14.930289 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db85625-5324-4606-a2f1-740416e8d218" path="/var/lib/kubelet/pods/2db85625-5324-4606-a2f1-740416e8d218/volumes" Feb 27 17:40:16 crc kubenswrapper[4752]: E0227 17:40:16.198924 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:40:16 crc kubenswrapper[4752]: E0227 17:40:16.199460 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:40:16 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:40:16 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pz6hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536900-5b8dn_openshift-infra(4c7d2d1c-023b-43e1-9015-5b572f4648cf): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:40:16 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:40:16 crc kubenswrapper[4752]: E0227 17:40:16.200680 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" Feb 27 17:40:16 crc kubenswrapper[4752]: E0227 17:40:16.909698 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:40:16 crc kubenswrapper[4752]: E0227 17:40:16.914509 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" Feb 27 17:40:17 crc kubenswrapper[4752]: E0227 17:40:17.909680 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:40:19 crc kubenswrapper[4752]: E0227 17:40:19.909657 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dm9bt" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" Feb 27 17:40:20 crc kubenswrapper[4752]: E0227 17:40:20.107419 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:40:20 crc kubenswrapper[4752]: E0227 17:40:20.107729 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:40:20 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:40:20 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9djm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536898-598km_openshift-infra(cc36acda-9447-479d-b741-c063ecb91f3e): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:40:20 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:40:20 crc kubenswrapper[4752]: E0227 17:40:20.108997 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:40:20 crc kubenswrapper[4752]: I0227 17:40:20.854613 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:40:20 crc kubenswrapper[4752]: I0227 17:40:20.927385 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:40:23 crc kubenswrapper[4752]: I0227 17:40:23.922702 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:23 crc kubenswrapper[4752]: I0227 17:40:23.979578 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:24 crc kubenswrapper[4752]: I0227 17:40:24.342849 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"] Feb 27 17:40:24 crc kubenswrapper[4752]: I0227 17:40:24.562541 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8r7pq"] Feb 27 17:40:25 crc kubenswrapper[4752]: I0227 17:40:25.434245 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cllc6" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="registry-server" containerID="cri-o://0fc93b53f02a726982f59c05dd78edcc2b73d771891e7dc9366413374a0da2b8" gracePeriod=2 Feb 27 17:40:26 crc kubenswrapper[4752]: E0227 17:40:26.910088 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.452541 4752 generic.go:334] "Generic (PLEG): container finished" podID="049892e0-329b-442b-b232-997ee454f9c6" containerID="0fc93b53f02a726982f59c05dd78edcc2b73d771891e7dc9366413374a0da2b8" exitCode=0 Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.452892 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerDied","Data":"0fc93b53f02a726982f59c05dd78edcc2b73d771891e7dc9366413374a0da2b8"} Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.557231 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.618365 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities\") pod \"049892e0-329b-442b-b232-997ee454f9c6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.618439 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxkqq\" (UniqueName: \"kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq\") pod \"049892e0-329b-442b-b232-997ee454f9c6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.618485 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content\") pod \"049892e0-329b-442b-b232-997ee454f9c6\" (UID: \"049892e0-329b-442b-b232-997ee454f9c6\") " Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.619332 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities" (OuterVolumeSpecName: "utilities") pod "049892e0-329b-442b-b232-997ee454f9c6" (UID: "049892e0-329b-442b-b232-997ee454f9c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.625748 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq" (OuterVolumeSpecName: "kube-api-access-dxkqq") pod "049892e0-329b-442b-b232-997ee454f9c6" (UID: "049892e0-329b-442b-b232-997ee454f9c6"). InnerVolumeSpecName "kube-api-access-dxkqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.719710 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxkqq\" (UniqueName: \"kubernetes.io/projected/049892e0-329b-442b-b232-997ee454f9c6-kube-api-access-dxkqq\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.719762 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.779879 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "049892e0-329b-442b-b232-997ee454f9c6" (UID: "049892e0-329b-442b-b232-997ee454f9c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:40:27 crc kubenswrapper[4752]: I0227 17:40:27.820925 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/049892e0-329b-442b-b232-997ee454f9c6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.454680 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.455085 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" podUID="630c3496-e651-4923-869a-85ab326b2a36" containerName="controller-manager" containerID="cri-o://2beb1f23929946c6847396243f4e3bab90033a98d8842a35cd0d23a9dffbc45b" gracePeriod=30 Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.475010 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cllc6" event={"ID":"049892e0-329b-442b-b232-997ee454f9c6","Type":"ContainerDied","Data":"f7869c4384e6a4b6cc9d05c573dccc56b8d062db6eab32dbf51fc5c3a3d70019"} Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.475088 4752 scope.go:117] "RemoveContainer" containerID="0fc93b53f02a726982f59c05dd78edcc2b73d771891e7dc9366413374a0da2b8" Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.475319 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cllc6" Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.496682 4752 scope.go:117] "RemoveContainer" containerID="01da1ce333dfd3dcda701b9059d039a737a4049c29307747c4b9cd58800cf038" Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.537301 4752 scope.go:117] "RemoveContainer" containerID="0e9fd2e747fc409eb6f57eaaeda905137a4f650ca891f9ce18922e79bc2e2b23" Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.537415 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"] Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.540938 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cllc6"] Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.571409 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.571613 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" podUID="9534fc15-28d5-4739-a207-0b657411460b" containerName="route-controller-manager" containerID="cri-o://06d4e84ec3a4dbef2d1bbf568fa725299c7a31b95d0052c693ec7ecd2e747b59" gracePeriod=30 Feb 27 17:40:28 crc kubenswrapper[4752]: I0227 17:40:28.914378 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049892e0-329b-442b-b232-997ee454f9c6" path="/var/lib/kubelet/pods/049892e0-329b-442b-b232-997ee454f9c6/volumes" Feb 27 17:40:29 crc kubenswrapper[4752]: I0227 17:40:29.485445 4752 generic.go:334] "Generic (PLEG): container finished" podID="630c3496-e651-4923-869a-85ab326b2a36" containerID="2beb1f23929946c6847396243f4e3bab90033a98d8842a35cd0d23a9dffbc45b" exitCode=0 Feb 27 17:40:29 crc kubenswrapper[4752]: I0227 17:40:29.485574 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" event={"ID":"630c3496-e651-4923-869a-85ab326b2a36","Type":"ContainerDied","Data":"2beb1f23929946c6847396243f4e3bab90033a98d8842a35cd0d23a9dffbc45b"} Feb 27 17:40:29 crc kubenswrapper[4752]: E0227 17:40:29.815308 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:40:29 crc kubenswrapper[4752]: E0227 17:40:29.815637 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92dcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6kwhk_openshift-marketplace(78323811-0abf-4cc6-921c-5d0e56e895a3): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:29 crc kubenswrapper[4752]: E0227 17:40:29.816951 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:40:29 crc kubenswrapper[4752]: I0227 17:40:29.895449 4752 patch_prober.go:28] interesting pod/route-controller-manager-5589d447db-8zx2p 
Feb 27 17:40:29 crc kubenswrapper[4752]: I0227 17:40:29.895449 4752 patch_prober.go:28] interesting pod/route-controller-manager-5589d447db-8zx2p container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Feb 27 17:40:29 crc kubenswrapper[4752]: I0227 17:40:29.896100 4752 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" podUID="9534fc15-28d5-4739-a207-0b657411460b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Feb 27 17:40:29 crc kubenswrapper[4752]: E0227 17:40:29.909391 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.496236 4752 generic.go:334] "Generic (PLEG): container finished" podID="9534fc15-28d5-4739-a207-0b657411460b" containerID="06d4e84ec3a4dbef2d1bbf568fa725299c7a31b95d0052c693ec7ecd2e747b59" exitCode=0
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.496281 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" event={"ID":"9534fc15-28d5-4739-a207-0b657411460b","Type":"ContainerDied","Data":"06d4e84ec3a4dbef2d1bbf568fa725299c7a31b95d0052c693ec7ecd2e747b59"}
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.838580 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.869347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config\") pod \"9534fc15-28d5-4739-a207-0b657411460b\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") "
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.869759 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca\") pod \"9534fc15-28d5-4739-a207-0b657411460b\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") "
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.869855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert\") pod \"9534fc15-28d5-4739-a207-0b657411460b\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") "
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.869927 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jzc7\" (UniqueName: \"kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7\") pod \"9534fc15-28d5-4739-a207-0b657411460b\" (UID: \"9534fc15-28d5-4739-a207-0b657411460b\") "
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.872190 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca" (OuterVolumeSpecName: "client-ca") pod "9534fc15-28d5-4739-a207-0b657411460b" (UID: "9534fc15-28d5-4739-a207-0b657411460b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.872023 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config" (OuterVolumeSpecName: "config") pod "9534fc15-28d5-4739-a207-0b657411460b" (UID: "9534fc15-28d5-4739-a207-0b657411460b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.879858 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9534fc15-28d5-4739-a207-0b657411460b" (UID: "9534fc15-28d5-4739-a207-0b657411460b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.879951 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7" (OuterVolumeSpecName: "kube-api-access-8jzc7") pod "9534fc15-28d5-4739-a207-0b657411460b" (UID: "9534fc15-28d5-4739-a207-0b657411460b"). InnerVolumeSpecName "kube-api-access-8jzc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883039 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"]
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883495 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883526 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883541 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="extract-utilities"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883554 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="extract-utilities"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883581 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="extract-content"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883596 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="extract-content"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883618 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="extract-utilities"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883631 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="extract-utilities"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883654 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883665 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883682 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9534fc15-28d5-4739-a207-0b657411460b" containerName="route-controller-manager"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883695 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="9534fc15-28d5-4739-a207-0b657411460b" containerName="route-controller-manager"
Feb 27 17:40:30 crc kubenswrapper[4752]: E0227 17:40:30.883721 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="extract-content"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883733 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="extract-content"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.883973 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="9534fc15-28d5-4739-a207-0b657411460b" containerName="route-controller-manager"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.884001 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="049892e0-329b-442b-b232-997ee454f9c6" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.884025 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db85625-5324-4606-a2f1-740416e8d218" containerName="registry-server"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.884696 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.887681 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"]
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.916535 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971177 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-client-ca\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971225 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-config\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971356 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/570355a7-01e2-4064-ae41-f43709dc4b88-kube-api-access-gstbp\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971400 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/570355a7-01e2-4064-ae41-f43709dc4b88-serving-cert\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971457 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jzc7\" (UniqueName: \"kubernetes.io/projected/9534fc15-28d5-4739-a207-0b657411460b-kube-api-access-8jzc7\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971470 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-config\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971479 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9534fc15-28d5-4739-a207-0b657411460b-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:30 crc kubenswrapper[4752]: I0227 17:40:30.971532 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9534fc15-28d5-4739-a207-0b657411460b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.072696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca\") pod \"630c3496-e651-4923-869a-85ab326b2a36\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") "
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.072818 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert\") pod \"630c3496-e651-4923-869a-85ab326b2a36\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") "
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.072887 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config\") pod \"630c3496-e651-4923-869a-85ab326b2a36\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") "
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.072916 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles\") pod \"630c3496-e651-4923-869a-85ab326b2a36\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") "
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.072969 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk58k\" (UniqueName: \"kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k\") pod \"630c3496-e651-4923-869a-85ab326b2a36\" (UID: \"630c3496-e651-4923-869a-85ab326b2a36\") "
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.073211 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/570355a7-01e2-4064-ae41-f43709dc4b88-kube-api-access-gstbp\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.073246 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/570355a7-01e2-4064-ae41-f43709dc4b88-serving-cert\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.073319 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-client-ca\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.073361 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-config\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.074182 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config" (OuterVolumeSpecName: "config") pod "630c3496-e651-4923-869a-85ab326b2a36" (UID: "630c3496-e651-4923-869a-85ab326b2a36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.074855 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-config\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.076646 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k" (OuterVolumeSpecName: "kube-api-access-xk58k") pod "630c3496-e651-4923-869a-85ab326b2a36" (UID: "630c3496-e651-4923-869a-85ab326b2a36"). InnerVolumeSpecName "kube-api-access-xk58k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.076727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "630c3496-e651-4923-869a-85ab326b2a36" (UID: "630c3496-e651-4923-869a-85ab326b2a36"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.078429 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/570355a7-01e2-4064-ae41-f43709dc4b88-client-ca\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.078487 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "630c3496-e651-4923-869a-85ab326b2a36" (UID: "630c3496-e651-4923-869a-85ab326b2a36"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.078740 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca" (OuterVolumeSpecName: "client-ca") pod "630c3496-e651-4923-869a-85ab326b2a36" (UID: "630c3496-e651-4923-869a-85ab326b2a36"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.082346 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/570355a7-01e2-4064-ae41-f43709dc4b88-serving-cert\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.094404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstbp\" (UniqueName: \"kubernetes.io/projected/570355a7-01e2-4064-ae41-f43709dc4b88-kube-api-access-gstbp\") pod \"route-controller-manager-764d847c9b-pcf5d\" (UID: \"570355a7-01e2-4064-ae41-f43709dc4b88\") " pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.174202 4752 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.174231 4752 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-config\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.174246 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk58k\" (UniqueName: \"kubernetes.io/projected/630c3496-e651-4923-869a-85ab326b2a36-kube-api-access-xk58k\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.174260 4752 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/630c3496-e651-4923-869a-85ab326b2a36-client-ca\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.174271 4752 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/630c3496-e651-4923-869a-85ab326b2a36-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.225480 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.504868 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p" event={"ID":"9534fc15-28d5-4739-a207-0b657411460b","Type":"ContainerDied","Data":"0d0adfe6fc095f25c983e92c9bb2e947e5b491a9a67f3f47a7061ee31e35b62a"}
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.504926 4752 scope.go:117] "RemoveContainer" containerID="06d4e84ec3a4dbef2d1bbf568fa725299c7a31b95d0052c693ec7ecd2e747b59"
Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.504921 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.510622 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9c88f797-h9v2p" event={"ID":"630c3496-e651-4923-869a-85ab326b2a36","Type":"ContainerDied","Data":"8ae7b1f53fef38dc7581c346ee84d8c69fe044d89bdc7c93dfcad2c989730ca4"} Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.513085 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerStarted","Data":"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6"} Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.528746 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.532671 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5589d447db-8zx2p"] Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.543272 4752 scope.go:117] "RemoveContainer" containerID="2beb1f23929946c6847396243f4e3bab90033a98d8842a35cd0d23a9dffbc45b" Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.571956 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.575178 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d9c88f797-h9v2p"] Feb 27 17:40:31 crc kubenswrapper[4752]: I0227 17:40:31.724129 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d"] Feb 27 17:40:31 crc kubenswrapper[4752]: W0227 17:40:31.732544 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod570355a7_01e2_4064_ae41_f43709dc4b88.slice/crio-52bbb2ca2a047a97532741d7ed1c67b1581865d8dccf53b51dce1ccc5efcae96 WatchSource:0}: Error finding container 52bbb2ca2a047a97532741d7ed1c67b1581865d8dccf53b51dce1ccc5efcae96: Status 404 returned error can't find the container with id 52bbb2ca2a047a97532741d7ed1c67b1581865d8dccf53b51dce1ccc5efcae96 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.323493 4752 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.324470 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381" gracePeriod=15 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.324438 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d" gracePeriod=15 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.324651 4752 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f" gracePeriod=15 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.324694 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82" gracePeriod=15 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.324974 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70" gracePeriod=15 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.325759 4752 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326132 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326190 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326217 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326233 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326256 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326271 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326297 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326312 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326330 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326345 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326366 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326380 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326399 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326414 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326432 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326446 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326460 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630c3496-e651-4923-869a-85ab326b2a36" containerName="controller-manager" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326475 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="630c3496-e651-4923-869a-85ab326b2a36" containerName="controller-manager" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326492 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326506 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.326532 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326550 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326781 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326806 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326822 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326841 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="630c3496-e651-4923-869a-85ab326b2a36" containerName="controller-manager" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326858 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326885 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326904 4752 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326925 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326949 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.326968 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.327256 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.327282 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.327651 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.389059 4752 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.389892 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.395739 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.395865 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.395956 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.396033 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.396080 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.396122 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.396269 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.396342 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.495305 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.495450 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24ds4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zhhxr_openshift-marketplace(760298d8-7405-4c9e-b322-b08dbc182da8): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.496398 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-zhhxr.18982b3fbd58ba3d\": dial tcp 38.102.83.102:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-zhhxr.18982b3fbd58ba3d openshift-marketplace 29640 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-zhhxr,UID:760298d8-7405-4c9e-b322-b08dbc182da8,APIVersion:v1,ResourceVersion:28444,FieldPath:spec.initContainers{extract-content},},Reason:Failed,Message:Failed to pull image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\": copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:40:06 +0000 UTC,LastTimestamp:2026-02-27 17:40:32.495368902 +0000 UTC m=+332.402185753,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.496578 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496813 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496851 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496870 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496886 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496902 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496918 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.496961 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497059 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497091 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497111 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497131 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497174 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497196 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497215 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.497233 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.520401 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.522637 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.523612 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d" exitCode=0 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.523719 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f" exitCode=0 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.523793 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381" exitCode=0 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.523672 4752 scope.go:117] "RemoveContainer" containerID="847623f81ea67c41616ce8b75151b8918e80c2ee2f82b6fe07bcd0ab5ce1add0" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.523872 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82" exitCode=2 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.526360 4752 generic.go:334] "Generic (PLEG): container finished" podID="137184c7-4f82-4685-89fa-d5152358e216" containerID="9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6" exitCode=0 Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.526432 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerDied","Data":"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6"} Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.530583 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" event={"ID":"570355a7-01e2-4064-ae41-f43709dc4b88","Type":"ContainerStarted","Data":"fb08e4874bc8902a732f4e679b99c8c8267bbc9e6c25cc11efa5335754f7ed88"} Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.530633 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" event={"ID":"570355a7-01e2-4064-ae41-f43709dc4b88","Type":"ContainerStarted","Data":"52bbb2ca2a047a97532741d7ed1c67b1581865d8dccf53b51dce1ccc5efcae96"} Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.531816 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.538216 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.580231 4752 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.581238 4752 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.581954 4752 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.582406 4752 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.582881 4752 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.583108 4752 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.583557 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="200ms" Feb 27 17:40:32 crc kubenswrapper[4752]: E0227 17:40:32.784592 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="400ms" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.922136 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630c3496-e651-4923-869a-85ab326b2a36" path="/var/lib/kubelet/pods/630c3496-e651-4923-869a-85ab326b2a36/volumes" Feb 27 17:40:32 crc kubenswrapper[4752]: I0227 17:40:32.923537 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9534fc15-28d5-4739-a207-0b657411460b" path="/var/lib/kubelet/pods/9534fc15-28d5-4739-a207-0b657411460b/volumes" Feb 27 17:40:33 crc kubenswrapper[4752]: E0227 17:40:33.186426 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="800ms" Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.544430 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.548049 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerStarted","Data":"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320"} Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.551334 4752 generic.go:334] "Generic (PLEG): container finished" podID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" containerID="53846f1a1a4722936fbb9592341b2aa3a7b6690e35684dee2c4d104d1aef013c" exitCode=0 Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.551414 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba","Type":"ContainerDied","Data":"53846f1a1a4722936fbb9592341b2aa3a7b6690e35684dee2c4d104d1aef013c"} Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.674266 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:40:33 crc kubenswrapper[4752]: I0227 17:40:33.674461 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:40:33 crc kubenswrapper[4752]: E0227 17:40:33.988345 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="1.6s" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.737855 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="registry-server" probeResult="failure" output=< Feb 27 17:40:34 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Feb 27 17:40:34 crc kubenswrapper[4752]: > Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.805558 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.806579 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.859348 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:40:34 crc kubenswrapper[4752]: E0227 17:40:34.907543 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.947890 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.947962 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.947998 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948071 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948110 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948132 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948654 4752 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948673 4752 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:34 crc kubenswrapper[4752]: I0227 17:40:34.948682 4752 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050045 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access\") pod \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050140 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir\") pod \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050185 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" (UID: "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050207 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock\") pod \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\" (UID: \"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba\") " Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050286 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" (UID: "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050661 4752 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.050693 4752 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.058468 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" (UID: "1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.151441 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.570738 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.571886 4752 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70" exitCode=0 Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.571992 4752 scope.go:117] "RemoveContainer" containerID="ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.572009 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.574496 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba","Type":"ContainerDied","Data":"7a9af0f29c36227bf90295a4a21e46c3307a6d70e2fbee8b2a9d857f7a6703d3"} Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.574522 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a9af0f29c36227bf90295a4a21e46c3307a6d70e2fbee8b2a9d857f7a6703d3" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.574535 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.581820 4752 generic.go:334] "Generic (PLEG): container finished" podID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerID="7ba4699f47b6c884353dd851daac687b91155e0bc8b0e36934e7e1b0252f2253" exitCode=0 Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.581948 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerDied","Data":"7ba4699f47b6c884353dd851daac687b91155e0bc8b0e36934e7e1b0252f2253"} Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.589876 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="3.2s" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.599543 4752 scope.go:117] "RemoveContainer" containerID="90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.631305 4752 scope.go:117] "RemoveContainer" containerID="ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.659871 4752 scope.go:117] "RemoveContainer" containerID="f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.684608 4752 scope.go:117] "RemoveContainer" containerID="269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.707813 4752 scope.go:117] "RemoveContainer" containerID="37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.741012 4752 scope.go:117] "RemoveContainer" containerID="ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.741774 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d\": container with ID starting with ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d not found: ID does not exist" containerID="ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.741808 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d"} err="failed to get container status 
\"ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d\": rpc error: code = NotFound desc = could not find container \"ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d\": container with ID starting with ad318061566db9fcc86af89349dee0916a2f6344e6e374e356f5929c30b1150d not found: ID does not exist" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.741833 4752 scope.go:117] "RemoveContainer" containerID="90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.743055 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\": container with ID starting with 90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f not found: ID does not exist" containerID="90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.743084 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f"} err="failed to get container status \"90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\": rpc error: code = NotFound desc = could not find container \"90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f\": container with ID starting with 90c1fe8d41bdc04d8780aca50265c0e6cf43abbb1cda58aae8f78d54d7c52d2f not found: ID does not exist" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.743103 4752 scope.go:117] "RemoveContainer" containerID="ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.743629 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\": container with ID starting with ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381 not found: ID does not exist" containerID="ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.743718 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381"} err="failed to get container status \"ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\": rpc error: code = NotFound desc = could not find container \"ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381\": container with ID starting with ba92a9228c8523b28a4a03f1f859339e21eac77405f37f5206e37ce4ee45d381 not found: ID does not exist" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.743762 4752 scope.go:117] "RemoveContainer" containerID="f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.744105 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\": container with ID starting with f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82 not found: ID does not exist" containerID="f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.744175 4752 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82"} err="failed to get container status \"f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\": rpc error: code = NotFound desc = could not find container \"f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82\": container with ID starting with f9dd98b133cbae7c1dfa12a34dd6973dd4a5fc4340d34c1c4b3296d036456a82 not found: ID does not exist" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.744203 4752 scope.go:117] "RemoveContainer" containerID="269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.744539 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\": container with ID starting with 269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70 not found: ID does not exist" containerID="269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.744577 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70"} err="failed to get container status \"269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\": rpc error: code = NotFound desc = could not find container \"269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70\": container with ID starting with 269b9e60d7588f2345dfd731d886ef405b3ccb49e23ae2e676a3fa354a0c9d70 not found: ID does not exist" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.744603 4752 scope.go:117] "RemoveContainer" containerID="37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07" Feb 27 17:40:35 crc kubenswrapper[4752]: E0227 17:40:35.744869 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\": container with ID starting with 37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07 not found: ID does not exist" containerID="37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07" Feb 27 17:40:35 crc kubenswrapper[4752]: I0227 17:40:35.744910 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07"} err="failed to get container status \"37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\": rpc error: code = NotFound desc = could not find container \"37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07\": container with ID starting with 37e6f59dc322fc11484d6aa2dd5963793eb509e8557bfa7cc0e5825530f71b07 not found: ID does not exist" Feb 27 17:40:36 crc kubenswrapper[4752]: I0227 17:40:36.918309 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.391912 4752 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.392604 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.393090 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.393827 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.394340 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.394673 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: E0227 17:40:37.421124 4752 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.422374 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 17:40:37 crc kubenswrapper[4752]: W0227 17:40:37.442753 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f1d3ada14d7f050d26cb32f8a21a3cf9bf48976f201d5e09a335246860832cff WatchSource:0}: Error finding container f1d3ada14d7f050d26cb32f8a21a3cf9bf48976f201d5e09a335246860832cff: Status 404 returned error can't find the container with id f1d3ada14d7f050d26cb32f8a21a3cf9bf48976f201d5e09a335246860832cff
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.600293 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerStarted","Data":"2e123599c2c2d3b21164b0784b7b129fe10571f86105f71e3121be9af312e409"}
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.601168 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.601575 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.601775 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.601935 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.602083 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:37 crc kubenswrapper[4752]: I0227 17:40:37.603560 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f1d3ada14d7f050d26cb32f8a21a3cf9bf48976f201d5e09a335246860832cff"}
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.613981 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7"}
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.614791 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:38 crc kubenswrapper[4752]: E0227 17:40:38.614878 4752 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.615508 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.615984 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.616314 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:38 crc kubenswrapper[4752]: I0227 17:40:38.616594 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused"
Feb 27 17:40:38 crc kubenswrapper[4752]: E0227 17:40:38.791114 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="6.4s"
Feb 27 17:40:39 crc kubenswrapper[4752]: E0227 17:40:39.624425 4752 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 27 17:40:40 crc kubenswrapper[4752]: E0227 17:40:40.097337 4752 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events/redhat-marketplace-zhhxr.18982b3fbd58ba3d\": dial tcp 38.102.83.102:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-zhhxr.18982b3fbd58ba3d openshift-marketplace 29640 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-zhhxr,UID:760298d8-7405-4c9e-b322-b08dbc182da8,APIVersion:v1,ResourceVersion:28444,FieldPath:spec.initContainers{extract-content},},Reason:Failed,Message:Failed to pull image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\": copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:40:06 +0000 UTC,LastTimestamp:2026-02-27 17:40:32.495368902 +0000 UTC m=+332.402185753,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
event="&Event{ObjectMeta:{redhat-marketplace-zhhxr.18982b3fbd58ba3d openshift-marketplace 29640 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-zhhxr,UID:760298d8-7405-4c9e-b322-b08dbc182da8,APIVersion:v1,ResourceVersion:28444,FieldPath:spec.initContainers{extract-content},},Reason:Failed,Message:Failed to pull image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\": copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-27 17:40:06 +0000 UTC,LastTimestamp:2026-02-27 17:40:32.495368902 +0000 UTC m=+332.402185753,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.820640 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.821059 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.890687 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.891390 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.891972 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.892421 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.892796 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.893335 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.909610 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.910021 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.910514 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.910912 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.911472 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.912066 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.912909 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.913500 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: E0227 17:40:40.914030 4752 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.914132 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.914637 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.915050 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:40 crc kubenswrapper[4752]: I0227 17:40:40.915401 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.642053 4752 generic.go:334] "Generic (PLEG): container finished" podID="899d1101-b4de-4326-b442-6450903b2a30" containerID="e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554" exitCode=0 Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.642406 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerDied","Data":"e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554"} Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.643434 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.644043 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.644271 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.644655 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.645382 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.645861 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.646272 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.647590 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.708547 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.709621 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.710288 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.710895 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: 
connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.711471 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.711969 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.712479 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.713260 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:41 crc kubenswrapper[4752]: I0227 17:40:41.713738 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:42 crc kubenswrapper[4752]: E0227 17:40:42.968506 4752 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" volumeName="registry-storage" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.744549 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.745355 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.745762 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc 
kubenswrapper[4752]: I0227 17:40:43.746354 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.746908 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.747394 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.747927 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.748502 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.748956 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.805722 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.806345 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.806662 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.806982 4752 status_manager.go:851] "Failed to get status for pod" 
podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.807497 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.808298 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.808912 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.809367 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:43 crc kubenswrapper[4752]: I0227 17:40:43.810088 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: E0227 17:40:45.192214 4752 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.102:6443: connect: connection refused" interval="7s" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.673187 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerStarted","Data":"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3"} Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.674269 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.674694 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" 
pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.675100 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.675815 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.676306 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.676746 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.677208 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.677245 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" event={"ID":"4c7d2d1c-023b-43e1-9015-5b572f4648cf","Type":"ContainerStarted","Data":"6a5e666e4d0f413f0dd7cd7a6f980f7e2bd91a7aa2dcdfddb7d828703dd53103"} Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.677657 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.678265 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.678897 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" 
pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.679364 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.679740 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.680215 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.680594 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.680939 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:45 crc kubenswrapper[4752]: I0227 17:40:45.681372 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.688746 4752 generic.go:334] "Generic (PLEG): container finished" podID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" containerID="6a5e666e4d0f413f0dd7cd7a6f980f7e2bd91a7aa2dcdfddb7d828703dd53103" exitCode=0 Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.688862 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" event={"ID":"4c7d2d1c-023b-43e1-9015-5b572f4648cf","Type":"ContainerDied","Data":"6a5e666e4d0f413f0dd7cd7a6f980f7e2bd91a7aa2dcdfddb7d828703dd53103"} Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.689791 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.690317 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.690920 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.691350 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.691731 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.692042 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.692386 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.692874 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.907334 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.908584 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: E0227 17:40:46.909814 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.910071 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.910674 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.911021 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.911532 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.912051 4752 status_manager.go:851] "Failed to get status for pod" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" pod="openshift-marketplace/redhat-marketplace-zhhxr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zhhxr\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.912464 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.912893 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.913433 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.913921 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.914429 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.915066 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.915700 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.916121 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.916458 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.916808 4752 status_manager.go:851] "Failed to get status for pod" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" pod="openshift-marketplace/redhat-marketplace-zhhxr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zhhxr\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.917371 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" 
pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.917829 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.933267 4752 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.933308 4752 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:46 crc kubenswrapper[4752]: E0227 17:40:46.933822 4752 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:46 crc kubenswrapper[4752]: I0227 17:40:46.934435 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:46 crc kubenswrapper[4752]: W0227 17:40:46.965899 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2634d0728f3bc693de2709615128fd9e1fc6f2001d521c954973970330a18893 WatchSource:0}: Error finding container 2634d0728f3bc693de2709615128fd9e1fc6f2001d521c954973970330a18893: Status 404 returned error can't find the container with id 2634d0728f3bc693de2709615128fd9e1fc6f2001d521c954973970330a18893 Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.698012 4752 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2306bccbcf6e627b422a0e6b905e9a87653bf14cb9e3daabe93fe4ab58866181" exitCode=0 Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.698116 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2306bccbcf6e627b422a0e6b905e9a87653bf14cb9e3daabe93fe4ab58866181"} Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.698203 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2634d0728f3bc693de2709615128fd9e1fc6f2001d521c954973970330a18893"} Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.698612 4752 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.698635 4752 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:47 crc kubenswrapper[4752]: E0227 17:40:47.699428 4752 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.699439 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.700094 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.700639 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.701249 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.701729 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.702239 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.702714 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.702743 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.703220 4752 status_manager.go:851] "Failed to get status for pod" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" pod="openshift-marketplace/redhat-marketplace-zhhxr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zhhxr\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.703698 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.703784 4752 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a" exitCode=1 Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.703848 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.703883 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a"} Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.704735 4752 scope.go:117] "RemoveContainer" containerID="b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.705001 4752 status_manager.go:851] "Failed to get status for pod" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" pod="openshift-marketplace/redhat-marketplace-zhhxr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zhhxr\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.706986 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.707502 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.709061 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.709798 4752 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.710390 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.710861 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.711252 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.711753 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:47 crc kubenswrapper[4752]: I0227 17:40:47.712221 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.057401 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.058045 4752 status_manager.go:851] "Failed to get status for pod" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.058576 4752 status_manager.go:851] "Failed to get status for pod" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536900-5b8dn\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.058804 4752 status_manager.go:851] "Failed to get status for pod" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" pod="openshift-marketplace/redhat-marketplace-zhhxr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zhhxr\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.058989 4752 status_manager.go:851] "Failed to get status for pod" podUID="570355a7-01e2-4064-ae41-f43709dc4b88" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-764d847c9b-pcf5d\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.059283 4752 status_manager.go:851] "Failed to get status for pod" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" pod="openshift-marketplace/community-operators-dm9bt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-dm9bt\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.059527 4752 status_manager.go:851] "Failed to get status for pod" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" pod="openshift-marketplace/redhat-marketplace-6kwhk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6kwhk\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.059777 4752 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.060034 4752 status_manager.go:851] "Failed to get status for pod" podUID="137184c7-4f82-4685-89fa-d5152358e216" pod="openshift-marketplace/redhat-operators-qvj4t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qvj4t\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.060406 4752 status_manager.go:851] "Failed to get status for pod" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" pod="openshift-infra/auto-csr-approver-29536898-598km" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29536898-598km\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.060724 4752 status_manager.go:851] "Failed to get status for pod" podUID="899d1101-b4de-4326-b442-6450903b2a30" pod="openshift-marketplace/community-operators-b7x2z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-b7x2z\": dial tcp 38.102.83.102:6443: connect: connection refused" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.250201 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz6hg\" (UniqueName: \"kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg\") pod \"4c7d2d1c-023b-43e1-9015-5b572f4648cf\" (UID: \"4c7d2d1c-023b-43e1-9015-5b572f4648cf\") " Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.260307 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg" (OuterVolumeSpecName: "kube-api-access-pz6hg") pod "4c7d2d1c-023b-43e1-9015-5b572f4648cf" (UID: "4c7d2d1c-023b-43e1-9015-5b572f4648cf"). InnerVolumeSpecName "kube-api-access-pz6hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.352238 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz6hg\" (UniqueName: \"kubernetes.io/projected/4c7d2d1c-023b-43e1-9015-5b572f4648cf-kube-api-access-pz6hg\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.715888 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" event={"ID":"4c7d2d1c-023b-43e1-9015-5b572f4648cf","Type":"ContainerDied","Data":"beae0dafa17616e4e47a6230d0e68a21646412fdc5ad115c0b00792631002ea3"} Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.716193 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beae0dafa17616e4e47a6230d0e68a21646412fdc5ad115c0b00792631002ea3" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.716255 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536900-5b8dn" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.719898 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.720324 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.720390 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"baef1f6e280c915b001f7a0379f2968e57cef4f145842aca2033b42379dcf233"} Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.724240 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd0fbc9342f1c60db6584894f3e23ffda580565ce774457df1ffabd1ce29479c"} Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.724285 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74c5807c900440d34e91ee55bba9b7d0ce327ec7ffb6cb23b4d423e89c3e51f3"} Feb 27 17:40:48 crc kubenswrapper[4752]: I0227 17:40:48.724297 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6a5a4bbe71be9d1225682fb9fcbed43e1f2607a5fc69645bd5f797f32f7d9a53"} Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.595784 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" containerName="oauth-openshift" containerID="cri-o://7cbe56d71fe50b73632fd26e4043862fc69a76c95cc6f3c35d7d72e1309f2c57" gracePeriod=15 Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.739588 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"24ec58b2ca5068e0b5029d047ab5e28feb30e2082fa41bdb446fbb541a846a7d"} Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.739631 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63d520610233d252ef34b019a0619351a7855a616484c75937fb374949e6f92f"} Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.739870 4752 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.739884 4752 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.740080 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.742714 4752 
generic.go:334] "Generic (PLEG): container finished" podID="02863d54-8b48-4358-8dfe-b43269b1da31" containerID="7cbe56d71fe50b73632fd26e4043862fc69a76c95cc6f3c35d7d72e1309f2c57" exitCode=0 Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.742740 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" event={"ID":"02863d54-8b48-4358-8dfe-b43269b1da31","Type":"ContainerDied","Data":"7cbe56d71fe50b73632fd26e4043862fc69a76c95cc6f3c35d7d72e1309f2c57"} Feb 27 17:40:49 crc kubenswrapper[4752]: E0227 17:40:49.910281 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536898-598km" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" Feb 27 17:40:49 crc kubenswrapper[4752]: I0227 17:40:49.974506 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.078267 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.078628 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.078769 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.078903 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079078 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079233 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.078393 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079510 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079532 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079592 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079623 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079653 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079682 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079720 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lmpr\" (UniqueName: \"kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.079762 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session\") pod \"02863d54-8b48-4358-8dfe-b43269b1da31\" (UID: \"02863d54-8b48-4358-8dfe-b43269b1da31\") " Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.080056 4752 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02863d54-8b48-4358-8dfe-b43269b1da31-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.080078 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.080834 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.080938 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.080947 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.085590 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.086065 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.087879 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.088676 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.088778 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr" (OuterVolumeSpecName: "kube-api-access-2lmpr") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "kube-api-access-2lmpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.089106 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.100395 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.101574 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.102960 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "02863d54-8b48-4358-8dfe-b43269b1da31" (UID: "02863d54-8b48-4358-8dfe-b43269b1da31"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181172 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181255 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181278 4752 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181299 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181323 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181342 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181363 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lmpr\" (UniqueName: \"kubernetes.io/projected/02863d54-8b48-4358-8dfe-b43269b1da31-kube-api-access-2lmpr\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181381 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181400 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181420 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181439 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.181456 4752 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/02863d54-8b48-4358-8dfe-b43269b1da31-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.751185 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" event={"ID":"02863d54-8b48-4358-8dfe-b43269b1da31","Type":"ContainerDied","Data":"45cb998762850c7c93f48207a452b65e58961e1752d31a6eef93626437a10b6f"} Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.751241 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8r7pq" Feb 27 17:40:50 crc kubenswrapper[4752]: I0227 17:40:50.751263 4752 scope.go:117] "RemoveContainer" containerID="7cbe56d71fe50b73632fd26e4043862fc69a76c95cc6f3c35d7d72e1309f2c57" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.124867 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.124937 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.178430 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.799819 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.934942 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.935078 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:51 crc kubenswrapper[4752]: I0227 17:40:51.944686 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.756650 4752 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.781575 4752 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.781614 4752 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.787367 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.856582 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="26e84f5f-4241-42f3-9e74-d1bcb9577b9a" Feb 27 17:40:54 crc kubenswrapper[4752]: I0227 17:40:54.996496 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:40:55 crc kubenswrapper[4752]: E0227 17:40:55.016137 4752 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:40:55 crc kubenswrapper[4752]: E0227 17:40:55.079116 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Feb 27 17:40:55 crc kubenswrapper[4752]: E0227 17:40:55.229432 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Feb 27 17:40:55 crc kubenswrapper[4752]: E0227 17:40:55.392553 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Feb 27 17:40:55 crc kubenswrapper[4752]: I0227 17:40:55.786680 4752 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:55 crc kubenswrapper[4752]: I0227 17:40:55.786726 4752 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1024425-74cb-401d-961a-72058a77a919" Feb 27 17:40:55 crc kubenswrapper[4752]: I0227 17:40:55.789397 4752 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="26e84f5f-4241-42f3-9e74-d1bcb9577b9a" Feb 27 17:40:56 crc kubenswrapper[4752]: I0227 17:40:56.999658 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:40:57 crc kubenswrapper[4752]: I0227 17:40:57.000123 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 27 17:40:57 crc kubenswrapper[4752]: I0227 17:40:57.001118 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 27 17:41:01 crc kubenswrapper[4752]: E0227 17:41:01.912020 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" Feb 27 17:41:04 crc kubenswrapper[4752]: I0227 17:41:04.420795 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 27 17:41:04 crc kubenswrapper[4752]: I0227 17:41:04.627632 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 27 17:41:04 crc 
kubenswrapper[4752]: I0227 17:41:04.850977 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.043777 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.053911 4752 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.245061 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.254179 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.327654 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.334181 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.575815 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.701830 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.861046 4752 generic.go:334] "Generic (PLEG): container finished" podID="cc36acda-9447-479d-b741-c063ecb91f3e" containerID="ac44873d4f7c3d3f68400f980f54120965c4903584aae35e308bc7e3fc7a8c4b" exitCode=0
Feb 27 17:41:05 crc kubenswrapper[4752]: I0227 17:41:05.861111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536898-598km" event={"ID":"cc36acda-9447-479d-b741-c063ecb91f3e","Type":"ContainerDied","Data":"ac44873d4f7c3d3f68400f980f54120965c4903584aae35e308bc7e3fc7a8c4b"}
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.256008 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.553277 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.767306 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.781116 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.835848 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.957630 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.975124 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 27 17:41:06 crc kubenswrapper[4752]: I0227 17:41:06.999925 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.000061 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.129207 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.162658 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.206124 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536898-598km"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.244696 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.248419 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.340457 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.374245 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9djm\" (UniqueName: \"kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm\") pod \"cc36acda-9447-479d-b741-c063ecb91f3e\" (UID: \"cc36acda-9447-479d-b741-c063ecb91f3e\") "
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.380635 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm" (OuterVolumeSpecName: "kube-api-access-r9djm") pod "cc36acda-9447-479d-b741-c063ecb91f3e" (UID: "cc36acda-9447-479d-b741-c063ecb91f3e"). InnerVolumeSpecName "kube-api-access-r9djm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.476015 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9djm\" (UniqueName: \"kubernetes.io/projected/cc36acda-9447-479d-b741-c063ecb91f3e-kube-api-access-r9djm\") on node \"crc\" DevicePath \"\""
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.507284 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.513665 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.538465 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.567707 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.655928 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.679979 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.870103 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.876174 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536898-598km" event={"ID":"cc36acda-9447-479d-b741-c063ecb91f3e","Type":"ContainerDied","Data":"d9e5e636fa97f4b425a771395cec6ca9c8622bead9a512d3f403b668a9044791"}
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.876231 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536898-598km"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.876245 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e5e636fa97f4b425a771395cec6ca9c8622bead9a512d3f403b668a9044791"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.893612 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.906888 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.935644 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.941485 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.982923 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 27 17:41:07 crc kubenswrapper[4752]: I0227 17:41:07.996988 4752 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.061012 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.179440 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.185909 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.238455 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.393993 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.395015 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.452322 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.680620 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.783231 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 27 17:41:08 crc kubenswrapper[4752]: I0227 17:41:08.812883 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.090769 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.199953 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.204753 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.276115 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.299969 4752 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.305804 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.323957 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.360559 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.400732 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.426092 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.487093 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.503648 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.633329 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.725631 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.808332 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.819897 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.827281 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.838805 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.844835 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.856208 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 27 17:41:09 crc kubenswrapper[4752]: I0227 17:41:09.960947 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.023072 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.035815 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.043383 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.090517 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.138769 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.161362 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.201645 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.299655 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.346208 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.432420 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.439480 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.480798 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.497486 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.622126 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.634279 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.732681 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.791115 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 27 17:41:10 crc kubenswrapper[4752]: I0227 17:41:10.972635 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.013631 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.041225 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.126023 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.132537 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.162393 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.192346 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.259214 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.293420 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.336891 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.480185 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.547044 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.584242 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.635810 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.646907 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.663086 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.679363 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.688390 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.779606 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.780507 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.816525 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.829333 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.853951 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.856642 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 27 17:41:11 crc kubenswrapper[4752]: I0227 17:41:11.994931 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.096546 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.209369 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.280221 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.349230 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.352000 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.400799 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.454269 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.659190 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.795311 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.802875 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.926466 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.955876 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 27 17:41:12 crc kubenswrapper[4752]: I0227 17:41:12.970249 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.004948 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.058446 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.163752 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.207846 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.250121 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.313609 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.348810 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.499742 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.515178 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.587699 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.617071 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.712332 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.728012 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.778209 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.854473 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.859135 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.895422 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.907995 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.933803 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 27 17:41:13 crc kubenswrapper[4752]: I0227 17:41:13.934993 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.018735 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.025734 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.037231 4752 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.039177 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qvj4t" podStartSLOduration=42.855302585 podStartE2EDuration="2m1.039127961s" podCreationTimestamp="2026-02-27 17:39:13 +0000 UTC" firstStartedPulling="2026-02-27 17:39:14.778056198 +0000 UTC m=+254.684873039" lastFinishedPulling="2026-02-27 17:40:32.961881524 +0000 UTC m=+332.868698415" observedRunningTime="2026-02-27 17:40:54.632473979 +0000 UTC m=+354.539290840" watchObservedRunningTime="2026-02-27 17:41:14.039127961 +0000 UTC m=+373.945944822"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.039488 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dm9bt" podStartSLOduration=40.270145186 podStartE2EDuration="2m4.03948321s" podCreationTimestamp="2026-02-27 17:39:10 +0000 UTC" firstStartedPulling="2026-02-27 17:39:12.51943974 +0000 UTC m=+252.426256591" lastFinishedPulling="2026-02-27 17:40:36.288777754 +0000 UTC m=+336.195594615" observedRunningTime="2026-02-27 17:40:54.754711474 +0000 UTC m=+354.661528325" watchObservedRunningTime="2026-02-27 17:41:14.03948321 +0000 UTC m=+373.946300071"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.042030 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-764d847c9b-pcf5d" podStartSLOduration=46.042022752 podStartE2EDuration="46.042022752s" podCreationTimestamp="2026-02-27 17:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:40:54.733431522 +0000 UTC m=+354.640248383" watchObservedRunningTime="2026-02-27 17:41:14.042022752 +0000 UTC m=+373.948839613"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.042216 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7x2z" podStartSLOduration=32.891111128 podStartE2EDuration="2m4.042206547s" podCreationTimestamp="2026-02-27 17:39:10 +0000 UTC" firstStartedPulling="2026-02-27 17:39:13.567024747 +0000 UTC m=+253.473841598" lastFinishedPulling="2026-02-27 17:40:44.718120166 +0000 UTC m=+344.624937017" observedRunningTime="2026-02-27 17:40:54.667486016 +0000 UTC m=+354.574302867" watchObservedRunningTime="2026-02-27 17:41:14.042206547 +0000 UTC m=+373.949023428"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043082 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-8r7pq"]
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043134 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-6498dd5cbd-6294j"]
Feb 27 17:41:14 crc kubenswrapper[4752]: E0227 17:41:14.043372 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043390 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: E0227 17:41:14.043409 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" containerName="oauth-openshift"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043418 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" containerName="oauth-openshift"
Feb 27 17:41:14 crc kubenswrapper[4752]: E0227 17:41:14.043432 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" containerName="installer"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043441 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" containerName="installer"
Feb 27 17:41:14 crc kubenswrapper[4752]: E0227 17:41:14.043452 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043460 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043577 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c99ee9b-219f-4c6d-a5b3-f92e9f11ffba" containerName="installer"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043591 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" containerName="oauth-openshift"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043603 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.043617 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" containerName="oc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.044050 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.047389 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.047957 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.047885 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.048452 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.049677 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.050492 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.052401 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.058865 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.093669 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.093647417 podStartE2EDuration="20.093647417s" podCreationTimestamp="2026-02-27 17:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:41:14.091683499 +0000 UTC m=+373.998500380" watchObservedRunningTime="2026-02-27 17:41:14.093647417 +0000 UTC m=+374.000464278"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.107620 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.140758 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.147201 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.168192 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-client-ca\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.168231 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d798419-e59b-4970-bb64-7b9c71e0b36c-serving-cert\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.168258 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-proxy-ca-bundles\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.168320 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wts2l\" (UniqueName: \"kubernetes.io/projected/1d798419-e59b-4970-bb64-7b9c71e0b36c-kube-api-access-wts2l\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.168360 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-config\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.269519 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-config\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.269602 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-client-ca\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.269628 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d798419-e59b-4970-bb64-7b9c71e0b36c-serving-cert\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.269654 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-proxy-ca-bundles\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.269696 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wts2l\" (UniqueName: \"kubernetes.io/projected/1d798419-e59b-4970-bb64-7b9c71e0b36c-kube-api-access-wts2l\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.271221 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-client-ca\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.271755 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-proxy-ca-bundles\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.274362 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d798419-e59b-4970-bb64-7b9c71e0b36c-config\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.283241 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.287180 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d798419-e59b-4970-bb64-7b9c71e0b36c-serving-cert\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.299339 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wts2l\" (UniqueName: \"kubernetes.io/projected/1d798419-e59b-4970-bb64-7b9c71e0b36c-kube-api-access-wts2l\") pod \"controller-manager-6498dd5cbd-6294j\" (UID: \"1d798419-e59b-4970-bb64-7b9c71e0b36c\") " pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.361540 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.411697 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.480539 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.481281 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.505443 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.545307 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.613533 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.631178 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.738894 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.791925 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.810670 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.915655 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02863d54-8b48-4358-8dfe-b43269b1da31" path="/var/lib/kubelet/pods/02863d54-8b48-4358-8dfe-b43269b1da31/volumes"
Feb 27 17:41:14 crc kubenswrapper[4752]: I0227 17:41:14.996535 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.219928 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.232230 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.337087 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.425332 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.518034 4752 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.524446 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.532320 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.608443 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.788797 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.800051 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.806954 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.819640 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.905916 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.948836 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 27 17:41:15 crc kubenswrapper[4752]: I0227 17:41:15.983268 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.011959 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.016708 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.037334 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.060360 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.100204 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.153860 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.205573 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.293201 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.312759 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.353771 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.409070 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.445237 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.451923 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.482477 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.654112 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.737319 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.752190 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 27 17:41:16 crc kubenswrapper[4752]: I0227 17:41:16.805981 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.000082 4752 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.000240 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.000334 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.001453 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"baef1f6e280c915b001f7a0379f2968e57cef4f145842aca2033b42379dcf233"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.001675 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://baef1f6e280c915b001f7a0379f2968e57cef4f145842aca2033b42379dcf233" gracePeriod=30
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.066234 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.067758 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.278741 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.302838 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.313392 4752 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.313730 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7" gracePeriod=5
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.440328 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.548168 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.614338 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.664306 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.686740 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.724781 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.746988 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.750326 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.854739 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.972115 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 27 17:41:17 crc kubenswrapper[4752]: I0227 17:41:17.984004 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.071766 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.111734 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.212505 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.559517 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.571616 4752 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.607434 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.668380 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.678352 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.844310 4752 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.897733 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.952800 4752 generic.go:334] "Generic (PLEG): container finished" podID="760298d8-7405-4c9e-b322-b08dbc182da8" containerID="34c530e0969c24c2e99daec12a3da7d311bff2fb274f0c137017c469be50ba5f" exitCode=0
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.952842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerDied","Data":"34c530e0969c24c2e99daec12a3da7d311bff2fb274f0c137017c469be50ba5f"}
Feb 27 17:41:18 crc kubenswrapper[4752]: I0227 17:41:18.977424 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 27 17:41:19 crc kubenswrapper[4752]: I0227 17:41:19.548135 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 27 17:41:19 crc kubenswrapper[4752]: I0227 17:41:19.580282 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 27 17:41:19 crc kubenswrapper[4752]: I0227 17:41:19.752849 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 27 17:41:19 crc kubenswrapper[4752]: I0227 17:41:19.768818 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 27 17:41:19 crc kubenswrapper[4752]: I0227 17:41:19.961305 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerStarted","Data":"634f55a9c540561019fb4db92385729210f79b4f6236eee7b113aec7b8692aa9"}
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.030568 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.254343 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zhhxr" podStartSLOduration=3.532610004 podStartE2EDuration="2m8.254325552s" podCreationTimestamp="2026-02-27 17:39:12 +0000 UTC" firstStartedPulling="2026-02-27 17:39:14.642668226 +0000 UTC m=+254.549485077" lastFinishedPulling="2026-02-27 17:41:19.364383774 +0000 UTC m=+379.271200625" observedRunningTime="2026-02-27 17:41:19.989493432 +0000 UTC m=+379.896310293" watchObservedRunningTime="2026-02-27 17:41:20.254325552 +0000 UTC m=+380.161142413"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.257916 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6498dd5cbd-6294j"]
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.312359 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.393403 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.640404 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6498dd5cbd-6294j"]
Feb 27 17:41:20 crc kubenswrapper[4752]: W0227 17:41:20.652549 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d798419_e59b_4970_bb64_7b9c71e0b36c.slice/crio-4d423676a3df9df7ee49da5f6adb12ff028c1376a1ff2b1de6b3f9cad8e9db30 WatchSource:0}: Error finding container 4d423676a3df9df7ee49da5f6adb12ff028c1376a1ff2b1de6b3f9cad8e9db30: Status 404 returned error can't find the container with id 4d423676a3df9df7ee49da5f6adb12ff028c1376a1ff2b1de6b3f9cad8e9db30
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.683402 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.968550 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j" event={"ID":"1d798419-e59b-4970-bb64-7b9c71e0b36c","Type":"ContainerStarted","Data":"6d936d7b4a5e130eac005b5d2002d1b201a83e3302954df9caca45baa16a66e8"}
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.968825 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j" event={"ID":"1d798419-e59b-4970-bb64-7b9c71e0b36c","Type":"ContainerStarted","Data":"4d423676a3df9df7ee49da5f6adb12ff028c1376a1ff2b1de6b3f9cad8e9db30"}
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.969214 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.974046 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j"
Feb 27 17:41:20 crc kubenswrapper[4752]: I0227 17:41:20.987174 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6498dd5cbd-6294j" podStartSLOduration=52.987161182 podStartE2EDuration="52.987161182s" podCreationTimestamp="2026-02-27 17:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:41:20.983033701 +0000 UTC m=+380.889850562" watchObservedRunningTime="2026-02-27 17:41:20.987161182 +0000 UTC m=+380.893978043"
Feb 27 17:41:21 crc kubenswrapper[4752]: I0227 17:41:21.092495 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 27 17:41:21 crc kubenswrapper[4752]: I0227 17:41:21.431887 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 27 17:41:21 crc kubenswrapper[4752]: I0227 17:41:21.647807 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.416803 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"]
Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.417321 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.417418 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.417597 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.418197 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.420925 4752 reflector.go:561] object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.420968 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.421404 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.421422 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.421551 4752
reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-login": failed to list *v1.Secret: secrets "v4-0-config-user-template-login" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.421565 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-login\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.421857 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.421872 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.421904 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.421914 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422011 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422023 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship 
found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422103 4752 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-znhcc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422137 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-znhcc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422362 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422377 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422440 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422451 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422783 4752 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422798 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" 
cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422914 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422926 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.422959 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.422969 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.423031 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.423042 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.423075 4752 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.423086 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list 
resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: W0227 17:41:22.423126 4752 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 27 17:41:22 crc kubenswrapper[4752]: E0227 17:41:22.423135 4752 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.442401 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"] Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.443133 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.443230 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.469946 4752 ???:1] "http: TLS handshake error from 192.168.126.11:34030: no serving certificate available for the kubelet" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481750 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481775 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481889 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481940 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.481965 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482049 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482071 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482094 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482119 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-dir\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482181 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482241 4752 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqgr9\" (UniqueName: \"kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482476 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-policies\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.482539 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583874 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583966 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583988 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584004 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583956 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.583993 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584101 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqgr9\" (UniqueName: \"kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584125 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-policies\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584165 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584124 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584199 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584187 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584297 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584357 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584449 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584514 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584556 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584676 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584740 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: 
\"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584856 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-dir\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584909 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.584944 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-dir\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.585062 4752 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.585096 4752 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.585121 4752 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.585167 4752 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.596410 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.686663 4752 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.917611 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.982072 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.982192 4752 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7" exitCode=137 Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.982280 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 27 17:41:22 crc kubenswrapper[4752]: I0227 17:41:22.982343 4752 scope.go:117] "RemoveContainer" containerID="84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.005718 4752 scope.go:117] "RemoveContainer" containerID="84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7" Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.006257 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7\": container with ID starting with 84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7 not found: ID does not exist" containerID="84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.006314 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7"} err="failed to get container status \"84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7\": rpc error: code = NotFound desc = could not find container \"84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7\": container with ID starting with 84c0b0a6827d2b9990e06b691d1ddf4609921756d5a3841c22e02554f89f4da7 not found: ID does not exist" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.148372 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.149369 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.211547 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.276061 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.290464 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.324535 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.335914 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-audit-policies\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.419001 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.425858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.573434 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.579592 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.584398 4752 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.584511 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.084482912 +0000 UTC m=+383.991299803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585451 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585539 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.085520428 +0000 UTC m=+383.992337289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585567 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585595 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.085587809 +0000 UTC m=+383.992404670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585658 4752 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.585700 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.085691312 +0000 UTC m=+383.992508173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588019 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588085 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588128 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588193 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.088125022 +0000 UTC m=+383.994942063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588037 4752 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588251 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.088217734 +0000 UTC m=+383.995034795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588319 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.088299526 +0000 UTC m=+383.995116657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.588356 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.088339297 +0000 UTC m=+383.995156408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync secret cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.598334 4752 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.666311 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.827584 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.829970 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.836889 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.839131 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.853432 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.875652 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.912758 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.936568 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.942314 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 27 17:41:23 crc kubenswrapper[4752]: E0227 17:41:23.948470 4752 projected.go:194] Error preparing data for projected volume kube-api-access-dqgr9 for pod openshift-authentication/oauth-openshift-7f445d97b7-ht8m5: failed to sync configmap cache: timed out waiting for the condition Feb 27 17:41:23 crc 
kubenswrapper[4752]: E0227 17:41:23.948549 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9 podName:1e8b5999-2184-44c0-8b0c-da9c76a82b41 nodeName:}" failed. No retries permitted until 2026-02-27 17:41:24.448528294 +0000 UTC m=+384.355345155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dqgr9" (UniqueName: "kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9") pod "oauth-openshift-7f445d97b7-ht8m5" (UID: "1e8b5999-2184-44c0-8b0c-da9c76a82b41") : failed to sync configmap cache: timed out waiting for the condition
Feb 27 17:41:23 crc kubenswrapper[4752]: I0227 17:41:23.959023 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109378 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109536 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109575 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109651 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109759 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.109811 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.110698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.110697 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.113389 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.113812 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-error\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.113836 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-user-template-login\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.115413 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.118647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-session\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.128023 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e8b5999-2184-44c0-8b0c-da9c76a82b41-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.515339 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqgr9\" (UniqueName: \"kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.520226 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqgr9\" (UniqueName: \"kubernetes.io/projected/1e8b5999-2184-44c0-8b0c-da9c76a82b41-kube-api-access-dqgr9\") pod \"oauth-openshift-7f445d97b7-ht8m5\" (UID: \"1e8b5999-2184-44c0-8b0c-da9c76a82b41\") " pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:24 crc kubenswrapper[4752]: I0227 17:41:24.561086 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:25 crc kubenswrapper[4752]: I0227 17:41:25.022923 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"]
Feb 27 17:41:25 crc kubenswrapper[4752]: I0227 17:41:25.068844 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zhhxr"
Feb 27 17:41:26 crc kubenswrapper[4752]: I0227 17:41:26.009932 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" event={"ID":"1e8b5999-2184-44c0-8b0c-da9c76a82b41","Type":"ContainerStarted","Data":"4e19e030d07c312be53220efe06db572c944a8ae02df6a567cb56ecb445ebc8d"}
Feb 27 17:41:26 crc kubenswrapper[4752]: I0227 17:41:26.010327 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:26 crc kubenswrapper[4752]: I0227 17:41:26.010343 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" event={"ID":"1e8b5999-2184-44c0-8b0c-da9c76a82b41","Type":"ContainerStarted","Data":"29f726a5ef0876de41106ce4376b0f7bb52d7ec81076d58325d411743b69ede3"}
Feb 27 17:41:26 crc kubenswrapper[4752]: I0227 17:41:26.019876 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5"
Feb 27 17:41:26 crc kubenswrapper[4752]: I0227 17:41:26.031904 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7f445d97b7-ht8m5" podStartSLOduration=62.031877 podStartE2EDuration="1m2.031877s" podCreationTimestamp="2026-02-27 17:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:41:26.030523777 +0000 UTC m=+385.937340668" watchObservedRunningTime="2026-02-27 17:41:26.031877 +0000 UTC m=+385.938693871"
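[editor's note, not part of the captured log] As a quick sanity check on the startup-latency entry above: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, i.e. 17:41:26.031877 - 17:40:24 = 62.031877 s = 1m2.031877s, which matches the reported podStartSLOduration=62.031877. The zero-value firstStartedPulling/lastFinishedPulling timestamps (0001-01-01) indicate that no image pull was observed for this start, so pull time contributes nothing to the duration.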
pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:41:43 crc kubenswrapper[4752]: I0227 17:41:43.154288 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.177714 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.180229 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.180702 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.180753 4752 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="baef1f6e280c915b001f7a0379f2968e57cef4f145842aca2033b42379dcf233" exitCode=137 Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.180784 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"baef1f6e280c915b001f7a0379f2968e57cef4f145842aca2033b42379dcf233"} Feb 27 17:41:47 crc kubenswrapper[4752]: I0227 17:41:47.180823 4752 scope.go:117] "RemoveContainer" containerID="b07158ca3ff468fe4b8147ba5f4a6ec9e775d214521e3d5a99d9e7dde15e760a" Feb 27 17:41:48 crc kubenswrapper[4752]: I0227 17:41:48.187832 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 27 17:41:48 crc kubenswrapper[4752]: I0227 17:41:48.189073 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 27 17:41:48 crc kubenswrapper[4752]: I0227 17:41:48.189114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a265367ea1b5f72a3fcd865bb7964d73d3d489db67c436a32a22ede29e62bcb2"} Feb 27 17:41:54 crc kubenswrapper[4752]: I0227 17:41:54.946887 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:41:57 crc kubenswrapper[4752]: I0227 17:41:57.000605 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:41:57 crc kubenswrapper[4752]: I0227 17:41:57.006061 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:41:57 crc kubenswrapper[4752]: I0227 17:41:57.251330 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 27 17:41:57 crc kubenswrapper[4752]: I0227 17:41:57.977212 4752 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.943720 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536902-bmcrh"] Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.944769 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.946475 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.947047 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.948415 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:42:04 crc kubenswrapper[4752]: I0227 17:42:04.960015 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536902-bmcrh"] Feb 27 17:42:05 crc kubenswrapper[4752]: I0227 17:42:05.084915 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") pod \"auto-csr-approver-29536902-bmcrh\" (UID: \"d475667f-7381-41d5-9e84-e20e48cef57e\") " pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:05 crc kubenswrapper[4752]: I0227 17:42:05.185815 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") pod \"auto-csr-approver-29536902-bmcrh\" (UID: \"d475667f-7381-41d5-9e84-e20e48cef57e\") " pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:05 crc kubenswrapper[4752]: I0227 17:42:05.208668 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") pod \"auto-csr-approver-29536902-bmcrh\" (UID: \"d475667f-7381-41d5-9e84-e20e48cef57e\") " pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:05 crc kubenswrapper[4752]: I0227 17:42:05.260913 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:05 crc kubenswrapper[4752]: I0227 17:42:05.670100 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536902-bmcrh"] Feb 27 17:42:06 crc kubenswrapper[4752]: I0227 17:42:06.330992 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" event={"ID":"d475667f-7381-41d5-9e84-e20e48cef57e","Type":"ContainerStarted","Data":"a23a6f48a1dc0b0af77350e9168c9f67a5b91b810e4fc46a8ee05b21334a8ca4"} Feb 27 17:42:06 crc kubenswrapper[4752]: E0227 17:42:06.743762 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:42:06 crc kubenswrapper[4752]: E0227 17:42:06.743935 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:42:06 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:42:06 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536902-bmcrh_openshift-infra(d475667f-7381-41d5-9e84-e20e48cef57e): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:42:06 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:42:06 crc kubenswrapper[4752]: E0227 17:42:06.745102 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" Feb 27 17:42:07 crc kubenswrapper[4752]: E0227 17:42:07.341479 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
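[editor's note, not part of the captured log] The &Container dump above is hard to read inline, so here is the command the failing "oc" container runs, copied verbatim from its Command:[...] field with only line breaks added (the image is registry.redhat.io/openshift4/ose-cli:latest):

    /bin/bash -c
      oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' \
        | xargs --no-run-if-empty oc adm certificate approve

The go-template prints the name of every CSR whose .status is still empty (i.e. pending), one per line, and xargs approves each; --no-run-if-empty keeps xargs from invoking the approve command when nothing is pending. The ErrImagePull here only delays that loop: a later run of the same pod finishes with exitCode=0, and csr-vx2ps is approved and issued at 17:42:45 further down.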
pod="openshift-infra/auto-csr-approver-29536902-bmcrh" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" Feb 27 17:42:10 crc kubenswrapper[4752]: E0227 17:42:10.915052 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:42:10 crc kubenswrapper[4752]: E0227 17:42:10.916393 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92dcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6kwhk_openshift-marketplace(78323811-0abf-4cc6-921c-5d0e56e895a3): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:42:10 crc kubenswrapper[4752]: E0227 17:42:10.919271 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:42:19 crc kubenswrapper[4752]: E0227 17:42:19.809179 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:42:19 crc kubenswrapper[4752]: E0227 17:42:19.809985 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:42:19 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:42:19 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536902-bmcrh_openshift-infra(d475667f-7381-41d5-9e84-e20e48cef57e): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:42:19 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:42:19 crc kubenswrapper[4752]: E0227 17:42:19.811245 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" Feb 27 17:42:22 crc kubenswrapper[4752]: E0227 17:42:22.909136 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:42:31 crc kubenswrapper[4752]: E0227 17:42:31.908337 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" Feb 27 17:42:35 crc kubenswrapper[4752]: E0227 17:42:35.908698 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" 
podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" Feb 27 17:42:36 crc kubenswrapper[4752]: I0227 17:42:36.323730 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:42:36 crc kubenswrapper[4752]: I0227 17:42:36.323814 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.151824 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ppdjt"] Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.152521 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.165658 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ppdjt"] Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265498 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265547 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-certificates\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265603 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265634 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-bound-sa-token\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265659 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265867 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l567d\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-kube-api-access-l567d\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.265952 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-tls\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.266019 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-trusted-ca\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.305228 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.367621 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-trusted-ca\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.367946 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-certificates\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.368073 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.368203 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-bound-sa-token\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.368303 4752 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.368420 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l567d\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-kube-api-access-l567d\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.368784 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-tls\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.369224 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.369702 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-certificates\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.369739 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-trusted-ca\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.376900 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.377052 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-registry-tls\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.387957 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l567d\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-kube-api-access-l567d\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.389319 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75528b60-0ddc-43ed-8bc0-ee6cbd516be2-bound-sa-token\") pod \"image-registry-66df7c8f76-ppdjt\" (UID: \"75528b60-0ddc-43ed-8bc0-ee6cbd516be2\") " pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.467176 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:37 crc kubenswrapper[4752]: I0227 17:42:37.905445 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ppdjt"] Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.479662 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.480421 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b7x2z" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="registry-server" containerID="cri-o://984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3" gracePeriod=2 Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.539598 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" event={"ID":"75528b60-0ddc-43ed-8bc0-ee6cbd516be2","Type":"ContainerStarted","Data":"8c56f7faca5068d10b122d83c5ed7e3cbebf078cee1b8e1d2bc503e3d22d9114"} Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.539646 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" event={"ID":"75528b60-0ddc-43ed-8bc0-ee6cbd516be2","Type":"ContainerStarted","Data":"a034a7af97ee8b95e9acc40c68ca641f36a49bc8d2d9d0352e22df91b9eb019e"} Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.539781 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.562737 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt" podStartSLOduration=1.562714711 podStartE2EDuration="1.562714711s" podCreationTimestamp="2026-02-27 17:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:42:38.559913361 +0000 UTC m=+458.466730232" watchObservedRunningTime="2026-02-27 17:42:38.562714711 +0000 UTC m=+458.469531572" Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.929226 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.993845 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities\") pod \"899d1101-b4de-4326-b442-6450903b2a30\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.993908 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content\") pod \"899d1101-b4de-4326-b442-6450903b2a30\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.993977 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrvgh\" (UniqueName: \"kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh\") pod \"899d1101-b4de-4326-b442-6450903b2a30\" (UID: \"899d1101-b4de-4326-b442-6450903b2a30\") " Feb 27 17:42:38 crc kubenswrapper[4752]: I0227 17:42:38.995647 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities" (OuterVolumeSpecName: "utilities") pod "899d1101-b4de-4326-b442-6450903b2a30" (UID: "899d1101-b4de-4326-b442-6450903b2a30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.001894 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh" (OuterVolumeSpecName: "kube-api-access-wrvgh") pod "899d1101-b4de-4326-b442-6450903b2a30" (UID: "899d1101-b4de-4326-b442-6450903b2a30"). InnerVolumeSpecName "kube-api-access-wrvgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.047412 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "899d1101-b4de-4326-b442-6450903b2a30" (UID: "899d1101-b4de-4326-b442-6450903b2a30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.095827 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.095860 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/899d1101-b4de-4326-b442-6450903b2a30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.095874 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrvgh\" (UniqueName: \"kubernetes.io/projected/899d1101-b4de-4326-b442-6450903b2a30-kube-api-access-wrvgh\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.549034 4752 generic.go:334] "Generic (PLEG): container finished" podID="899d1101-b4de-4326-b442-6450903b2a30" containerID="984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3" exitCode=0 Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.549114 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerDied","Data":"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3"} Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.549215 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7x2z" event={"ID":"899d1101-b4de-4326-b442-6450903b2a30","Type":"ContainerDied","Data":"a989bc54c41973efce7b203829bacf30e6f75abcef373ba1b2a1d4b779d48764"} Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.549235 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7x2z" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.549270 4752 scope.go:117] "RemoveContainer" containerID="984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.574706 4752 scope.go:117] "RemoveContainer" containerID="e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.606748 4752 scope.go:117] "RemoveContainer" containerID="1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.606850 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.619962 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b7x2z"] Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.639403 4752 scope.go:117] "RemoveContainer" containerID="984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3" Feb 27 17:42:39 crc kubenswrapper[4752]: E0227 17:42:39.640104 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3\": container with ID starting with 984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3 not found: ID does not exist" containerID="984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.640171 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3"} err="failed to get container status \"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3\": rpc error: code = NotFound desc = could not find container \"984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3\": container with ID starting with 984ceef2fb711d42d08bb0a2d6a63af0580b9abdaeeb8fc873b503f3824870c3 not found: ID does not exist" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.640198 4752 scope.go:117] "RemoveContainer" containerID="e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554" Feb 27 17:42:39 crc kubenswrapper[4752]: E0227 17:42:39.640764 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554\": container with ID starting with e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554 not found: ID does not exist" containerID="e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.640789 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554"} err="failed to get container status \"e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554\": rpc error: code = NotFound desc = could not find container \"e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554\": container with ID starting with e4d188dc042311fb27dd7900e2bb57c7bbf3adb883157a4b72f817d21418a554 not found: ID does not exist" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.640806 4752 scope.go:117] "RemoveContainer" 
containerID="1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2" Feb 27 17:42:39 crc kubenswrapper[4752]: E0227 17:42:39.641256 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2\": container with ID starting with 1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2 not found: ID does not exist" containerID="1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2" Feb 27 17:42:39 crc kubenswrapper[4752]: I0227 17:42:39.641305 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2"} err="failed to get container status \"1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2\": rpc error: code = NotFound desc = could not find container \"1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2\": container with ID starting with 1256f9673c37e0b3c9e151af931c3405e0a589bc9518d95a73406e42e9d096c2 not found: ID does not exist" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.281171 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"] Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.281566 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zhhxr" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="registry-server" containerID="cri-o://634f55a9c540561019fb4db92385729210f79b4f6236eee7b113aec7b8692aa9" gracePeriod=2 Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.562160 4752 generic.go:334] "Generic (PLEG): container finished" podID="760298d8-7405-4c9e-b322-b08dbc182da8" containerID="634f55a9c540561019fb4db92385729210f79b4f6236eee7b113aec7b8692aa9" exitCode=0 Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.562264 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerDied","Data":"634f55a9c540561019fb4db92385729210f79b4f6236eee7b113aec7b8692aa9"} Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.755738 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.820798 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities\") pod \"760298d8-7405-4c9e-b322-b08dbc182da8\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.820872 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content\") pod \"760298d8-7405-4c9e-b322-b08dbc182da8\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.820921 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ds4\" (UniqueName: \"kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4\") pod \"760298d8-7405-4c9e-b322-b08dbc182da8\" (UID: \"760298d8-7405-4c9e-b322-b08dbc182da8\") " Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.823125 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities" (OuterVolumeSpecName: "utilities") pod "760298d8-7405-4c9e-b322-b08dbc182da8" (UID: "760298d8-7405-4c9e-b322-b08dbc182da8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.828570 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4" (OuterVolumeSpecName: "kube-api-access-24ds4") pod "760298d8-7405-4c9e-b322-b08dbc182da8" (UID: "760298d8-7405-4c9e-b322-b08dbc182da8"). InnerVolumeSpecName "kube-api-access-24ds4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.852859 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "760298d8-7405-4c9e-b322-b08dbc182da8" (UID: "760298d8-7405-4c9e-b322-b08dbc182da8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.913303 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899d1101-b4de-4326-b442-6450903b2a30" path="/var/lib/kubelet/pods/899d1101-b4de-4326-b442-6450903b2a30/volumes" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.922868 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.922894 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/760298d8-7405-4c9e-b322-b08dbc182da8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:40 crc kubenswrapper[4752]: I0227 17:42:40.922907 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ds4\" (UniqueName: \"kubernetes.io/projected/760298d8-7405-4c9e-b322-b08dbc182da8-kube-api-access-24ds4\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.577110 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhhxr" event={"ID":"760298d8-7405-4c9e-b322-b08dbc182da8","Type":"ContainerDied","Data":"d4deb7355bc7c0c217ad810f134a92b0cfaa53583b26a0ed7bcb975e092a45a8"} Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.577190 4752 scope.go:117] "RemoveContainer" containerID="634f55a9c540561019fb4db92385729210f79b4f6236eee7b113aec7b8692aa9" Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.577196 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhhxr" Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.605500 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"] Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.614541 4752 scope.go:117] "RemoveContainer" containerID="34c530e0969c24c2e99daec12a3da7d311bff2fb274f0c137017c469be50ba5f" Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.614778 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhhxr"] Feb 27 17:42:41 crc kubenswrapper[4752]: I0227 17:42:41.644226 4752 scope.go:117] "RemoveContainer" containerID="fa128acf65ea08e40b355cfdea643b4a3d524d680578e4d277243d36ce63dc01" Feb 27 17:42:42 crc kubenswrapper[4752]: I0227 17:42:42.922094 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" path="/var/lib/kubelet/pods/760298d8-7405-4c9e-b322-b08dbc182da8/volumes" Feb 27 17:42:45 crc kubenswrapper[4752]: I0227 17:42:45.047543 4752 csr.go:261] certificate signing request csr-vx2ps is approved, waiting to be issued Feb 27 17:42:45 crc kubenswrapper[4752]: I0227 17:42:45.071240 4752 csr.go:257] certificate signing request csr-vx2ps is issued Feb 27 17:42:45 crc kubenswrapper[4752]: E0227 17:42:45.161943 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd475667f_7381_41d5_9e84_e20e48cef57e.slice/crio-conmon-b361613190f429b39ca0c0063f059f538ca42e9326823d5e46a5e6ff925985d2.scope\": RecentStats: unable to find data in memory cache]" Feb 27 17:42:45 crc kubenswrapper[4752]: I0227 17:42:45.609465 4752 generic.go:334] 
"Generic (PLEG): container finished" podID="d475667f-7381-41d5-9e84-e20e48cef57e" containerID="b361613190f429b39ca0c0063f059f538ca42e9326823d5e46a5e6ff925985d2" exitCode=0 Feb 27 17:42:45 crc kubenswrapper[4752]: I0227 17:42:45.609540 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" event={"ID":"d475667f-7381-41d5-9e84-e20e48cef57e","Type":"ContainerDied","Data":"b361613190f429b39ca0c0063f059f538ca42e9326823d5e46a5e6ff925985d2"} Feb 27 17:42:46 crc kubenswrapper[4752]: I0227 17:42:46.072613 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 20:09:18.064684675 +0000 UTC Feb 27 17:42:46 crc kubenswrapper[4752]: I0227 17:42:46.073405 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7394h26m31.991286775s for next certificate rotation Feb 27 17:42:46 crc kubenswrapper[4752]: I0227 17:42:46.992486 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.074268 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-04 19:03:58.501060047 +0000 UTC Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.074310 4752 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7465h21m11.426753851s for next certificate rotation Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.121117 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") pod \"d475667f-7381-41d5-9e84-e20e48cef57e\" (UID: \"d475667f-7381-41d5-9e84-e20e48cef57e\") " Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.127301 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm" (OuterVolumeSpecName: "kube-api-access-flprm") pod "d475667f-7381-41d5-9e84-e20e48cef57e" (UID: "d475667f-7381-41d5-9e84-e20e48cef57e"). InnerVolumeSpecName "kube-api-access-flprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.225389 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") on node \"crc\" DevicePath \"\"" Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" event={"ID":"d475667f-7381-41d5-9e84-e20e48cef57e","Type":"ContainerDied","Data":"a23a6f48a1dc0b0af77350e9168c9f67a5b91b810e4fc46a8ee05b21334a8ca4"} Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625184 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23a6f48a1dc0b0af77350e9168c9f67a5b91b810e4fc46a8ee05b21334a8ca4" Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625642 4752 util.go:48] "No ready sandbox for pod can be found. 
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.121117 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") pod \"d475667f-7381-41d5-9e84-e20e48cef57e\" (UID: \"d475667f-7381-41d5-9e84-e20e48cef57e\") "
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.127301 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm" (OuterVolumeSpecName: "kube-api-access-flprm") pod "d475667f-7381-41d5-9e84-e20e48cef57e" (UID: "d475667f-7381-41d5-9e84-e20e48cef57e"). InnerVolumeSpecName "kube-api-access-flprm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.225389 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flprm\" (UniqueName: \"kubernetes.io/projected/d475667f-7381-41d5-9e84-e20e48cef57e-kube-api-access-flprm\") on node \"crc\" DevicePath \"\""
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625105 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536902-bmcrh" event={"ID":"d475667f-7381-41d5-9e84-e20e48cef57e","Type":"ContainerDied","Data":"a23a6f48a1dc0b0af77350e9168c9f67a5b91b810e4fc46a8ee05b21334a8ca4"}
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625184 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23a6f48a1dc0b0af77350e9168c9f67a5b91b810e4fc46a8ee05b21334a8ca4"
Feb 27 17:42:47 crc kubenswrapper[4752]: I0227 17:42:47.625642 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536902-bmcrh"
Feb 27 17:42:49 crc kubenswrapper[4752]: E0227 17:42:49.910247 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3"
Feb 27 17:42:57 crc kubenswrapper[4752]: I0227 17:42:57.479530 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ppdjt"
Feb 27 17:42:57 crc kubenswrapper[4752]: I0227 17:42:57.556764 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"]
Feb 27 17:43:01 crc kubenswrapper[4752]: E0227 17:43:01.909432 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6kwhk" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.325208 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.325811 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.919070 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"]
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.919941 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8w4t" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="registry-server" containerID="cri-o://8aeb418c116a39ee31cd0026e29f7c590cba724acc16d7b4a09d9ae052d82151" gracePeriod=30
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.928062 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dm9bt"]
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.928412 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dm9bt" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="registry-server" containerID="cri-o://2e123599c2c2d3b21164b0784b7b129fe10571f86105f71e3121be9af312e409" gracePeriod=30
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.937684 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"]
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.937874 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" containerID="cri-o://f4d851b9c4185f6911f9a4bc4792f1845bb54d6de1540ddbfaa0c2cd6de11446" gracePeriod=30
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.959729 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"]
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.969391 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"]
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.969607 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qvj4t" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="registry-server" containerID="cri-o://7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320" gracePeriod=30
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982164 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9kms"]
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982382 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="extract-utilities"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982403 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="extract-utilities"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982416 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982422 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982436 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="extract-content"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982443 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="extract-content"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982453 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="extract-content"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982458 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="extract-content"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982467 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="extract-utilities"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982473 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="extract-utilities"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982486 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982491 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: E0227 17:43:06.982498 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" containerName="oc"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982504 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" containerName="oc"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982598 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="760298d8-7405-4c9e-b322-b08dbc182da8" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982608 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="899d1101-b4de-4326-b442-6450903b2a30" containerName="registry-server"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.982620 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" containerName="oc"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.983008 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:06 crc kubenswrapper[4752]: I0227 17:43:06.990018 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9kms"]
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.140935 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbdwp\" (UniqueName: \"kubernetes.io/projected/96220b7c-718c-4eca-b3e3-46d1143e6124-kube-api-access-mbdwp\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.141224 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.141306 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.242090 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbdwp\" (UniqueName: \"kubernetes.io/projected/96220b7c-718c-4eca-b3e3-46d1143e6124-kube-api-access-mbdwp\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.242139 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.242218 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.243542 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.254404 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/96220b7c-718c-4eca-b3e3-46d1143e6124-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.262352 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbdwp\" (UniqueName: \"kubernetes.io/projected/96220b7c-718c-4eca-b3e3-46d1143e6124-kube-api-access-mbdwp\") pod \"marketplace-operator-79b997595-s9kms\" (UID: \"96220b7c-718c-4eca-b3e3-46d1143e6124\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.316769 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.363473 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.369263 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443669 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content\") pod \"78323811-0abf-4cc6-921c-5d0e56e895a3\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443734 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities\") pod \"78323811-0abf-4cc6-921c-5d0e56e895a3\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443786 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities\") pod \"137184c7-4f82-4685-89fa-d5152358e216\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443819 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l262m\" (UniqueName: \"kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m\") pod \"137184c7-4f82-4685-89fa-d5152358e216\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443847 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content\") pod \"137184c7-4f82-4685-89fa-d5152358e216\" (UID: \"137184c7-4f82-4685-89fa-d5152358e216\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.443883 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92dcf\" (UniqueName: \"kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf\") pod \"78323811-0abf-4cc6-921c-5d0e56e895a3\" (UID: \"78323811-0abf-4cc6-921c-5d0e56e895a3\") " Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.444459 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78323811-0abf-4cc6-921c-5d0e56e895a3" (UID: "78323811-0abf-4cc6-921c-5d0e56e895a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.445064 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities" (OuterVolumeSpecName: "utilities") pod "78323811-0abf-4cc6-921c-5d0e56e895a3" (UID: "78323811-0abf-4cc6-921c-5d0e56e895a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.445995 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities" (OuterVolumeSpecName: "utilities") pod "137184c7-4f82-4685-89fa-d5152358e216" (UID: "137184c7-4f82-4685-89fa-d5152358e216"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.447271 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m" (OuterVolumeSpecName: "kube-api-access-l262m") pod "137184c7-4f82-4685-89fa-d5152358e216" (UID: "137184c7-4f82-4685-89fa-d5152358e216"). InnerVolumeSpecName "kube-api-access-l262m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.447528 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf" (OuterVolumeSpecName: "kube-api-access-92dcf") pod "78323811-0abf-4cc6-921c-5d0e56e895a3" (UID: "78323811-0abf-4cc6-921c-5d0e56e895a3"). InnerVolumeSpecName "kube-api-access-92dcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.545961 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.546014 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l262m\" (UniqueName: \"kubernetes.io/projected/137184c7-4f82-4685-89fa-d5152358e216-kube-api-access-l262m\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.546027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92dcf\" (UniqueName: \"kubernetes.io/projected/78323811-0abf-4cc6-921c-5d0e56e895a3-kube-api-access-92dcf\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.546038 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.546052 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78323811-0abf-4cc6-921c-5d0e56e895a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.567107 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "137184c7-4f82-4685-89fa-d5152358e216" (UID: "137184c7-4f82-4685-89fa-d5152358e216"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.648501 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/137184c7-4f82-4685-89fa-d5152358e216-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.736599 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9kms"] Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.763827 4752 generic.go:334] "Generic (PLEG): container finished" podID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerID="2e123599c2c2d3b21164b0784b7b129fe10571f86105f71e3121be9af312e409" exitCode=0 Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.763915 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerDied","Data":"2e123599c2c2d3b21164b0784b7b129fe10571f86105f71e3121be9af312e409"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.767730 4752 generic.go:334] "Generic (PLEG): container finished" podID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerID="8aeb418c116a39ee31cd0026e29f7c590cba724acc16d7b4a09d9ae052d82151" exitCode=0 Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.767818 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerDied","Data":"8aeb418c116a39ee31cd0026e29f7c590cba724acc16d7b4a09d9ae052d82151"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.770873 4752 generic.go:334] "Generic (PLEG): container finished" podID="137184c7-4f82-4685-89fa-d5152358e216" containerID="7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320" exitCode=0 Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.770939 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerDied","Data":"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.770964 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qvj4t" event={"ID":"137184c7-4f82-4685-89fa-d5152358e216","Type":"ContainerDied","Data":"b71ab18bb56ca20e3c29d69e08973645a3cec4a658e437a266506e2cf59bf5b0"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.770986 4752 scope.go:117] "RemoveContainer" containerID="7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.771116 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qvj4t" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.785419 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" event={"ID":"96220b7c-718c-4eca-b3e3-46d1143e6124","Type":"ContainerStarted","Data":"262cb43a43a8834f0e78dd0adb03ef1d73365e6ade2297813f694aad46bce657"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.788903 4752 generic.go:334] "Generic (PLEG): container finished" podID="bb16b639-2f9c-414f-8cae-41f805a10165" containerID="f4d851b9c4185f6911f9a4bc4792f1845bb54d6de1540ddbfaa0c2cd6de11446" exitCode=0 Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.788969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" event={"ID":"bb16b639-2f9c-414f-8cae-41f805a10165","Type":"ContainerDied","Data":"f4d851b9c4185f6911f9a4bc4792f1845bb54d6de1540ddbfaa0c2cd6de11446"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.791183 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kwhk" event={"ID":"78323811-0abf-4cc6-921c-5d0e56e895a3","Type":"ContainerDied","Data":"80a6ca63d9072e0788ee8e2f438f632f56651e2a5afba730fab068d44a8eaa5e"} Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.791278 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kwhk" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.819576 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"] Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.820812 4752 scope.go:117] "RemoveContainer" containerID="9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.826130 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qvj4t"] Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.873113 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"] Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.878333 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kwhk"] Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.900125 4752 scope.go:117] "RemoveContainer" containerID="2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.927992 4752 scope.go:117] "RemoveContainer" containerID="7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320" Feb 27 17:43:07 crc kubenswrapper[4752]: E0227 17:43:07.929382 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320\": container with ID starting with 7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320 not found: ID does not exist" containerID="7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320" Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.929433 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320"} err="failed to get container status \"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320\": rpc error: 
Feb 27 17:43:07 crc kubenswrapper[4752]: E0227 17:43:07.929382 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320\": container with ID starting with 7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320 not found: ID does not exist" containerID="7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.929433 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320"} err="failed to get container status \"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320\": rpc error: code = NotFound desc = could not find container \"7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320\": container with ID starting with 7723bf417f6f623b8becc93eec0cdf5ee3c260aab62b35c502e303e6b0624320 not found: ID does not exist"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.929464 4752 scope.go:117] "RemoveContainer" containerID="9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6"
Feb 27 17:43:07 crc kubenswrapper[4752]: E0227 17:43:07.929921 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6\": container with ID starting with 9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6 not found: ID does not exist" containerID="9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.929949 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6"} err="failed to get container status \"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6\": rpc error: code = NotFound desc = could not find container \"9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6\": container with ID starting with 9b2a66ff1a9bfe5541ad6a86192ff7e124123487a2168bb9bdee799ca9a092a6 not found: ID does not exist"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.929966 4752 scope.go:117] "RemoveContainer" containerID="2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b"
Feb 27 17:43:07 crc kubenswrapper[4752]: E0227 17:43:07.930317 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b\": container with ID starting with 2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b not found: ID does not exist" containerID="2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.930374 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b"} err="failed to get container status \"2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b\": rpc error: code = NotFound desc = could not find container \"2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b\": container with ID starting with 2df76a3f751acf725660d64460266ee916899aa32644c9d16606b37d321a7d4b not found: ID does not exist"
Feb 27 17:43:07 crc kubenswrapper[4752]: I0227 17:43:07.930393 4752 scope.go:117] "RemoveContainer" containerID="119ea4121e14069301ef9da8681aafac5477338f17d5d477e4dada537b14306d"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.126307 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dm9bt"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.129776 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8w4t"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.134807 4752 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.221623 4752 scope.go:117] "RemoveContainer" containerID="81b581e3c047bb180deb6110ca6cb6d537277e7ffaf240d1aecd6e4c6679653f" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259696 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities\") pod \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259748 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content\") pod \"cad177e6-5ee1-4884-bb19-b9413b183acc\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259806 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d592b\" (UniqueName: \"kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b\") pod \"bb16b639-2f9c-414f-8cae-41f805a10165\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259837 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics\") pod \"bb16b639-2f9c-414f-8cae-41f805a10165\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259899 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl86p\" (UniqueName: \"kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p\") pod \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259952 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities\") pod \"cad177e6-5ee1-4884-bb19-b9413b183acc\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.259985 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca\") pod \"bb16b639-2f9c-414f-8cae-41f805a10165\" (UID: \"bb16b639-2f9c-414f-8cae-41f805a10165\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.260024 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content\") pod \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\" (UID: \"ebdbb722-11b5-43c4-b8dc-8758bbc7164c\") " Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.260058 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btb9f\" (UniqueName: \"kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f\") pod \"cad177e6-5ee1-4884-bb19-b9413b183acc\" (UID: \"cad177e6-5ee1-4884-bb19-b9413b183acc\") " Feb 27 17:43:08 crc 
kubenswrapper[4752]: I0227 17:43:08.260850 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities" (OuterVolumeSpecName: "utilities") pod "ebdbb722-11b5-43c4-b8dc-8758bbc7164c" (UID: "ebdbb722-11b5-43c4-b8dc-8758bbc7164c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.261373 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bb16b639-2f9c-414f-8cae-41f805a10165" (UID: "bb16b639-2f9c-414f-8cae-41f805a10165"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.261747 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities" (OuterVolumeSpecName: "utilities") pod "cad177e6-5ee1-4884-bb19-b9413b183acc" (UID: "cad177e6-5ee1-4884-bb19-b9413b183acc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.266104 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p" (OuterVolumeSpecName: "kube-api-access-rl86p") pod "ebdbb722-11b5-43c4-b8dc-8758bbc7164c" (UID: "ebdbb722-11b5-43c4-b8dc-8758bbc7164c"). InnerVolumeSpecName "kube-api-access-rl86p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.266652 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f" (OuterVolumeSpecName: "kube-api-access-btb9f") pod "cad177e6-5ee1-4884-bb19-b9413b183acc" (UID: "cad177e6-5ee1-4884-bb19-b9413b183acc"). InnerVolumeSpecName "kube-api-access-btb9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.266852 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b" (OuterVolumeSpecName: "kube-api-access-d592b") pod "bb16b639-2f9c-414f-8cae-41f805a10165" (UID: "bb16b639-2f9c-414f-8cae-41f805a10165"). InnerVolumeSpecName "kube-api-access-d592b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.266926 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bb16b639-2f9c-414f-8cae-41f805a10165" (UID: "bb16b639-2f9c-414f-8cae-41f805a10165"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.334025 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebdbb722-11b5-43c4-b8dc-8758bbc7164c" (UID: "ebdbb722-11b5-43c4-b8dc-8758bbc7164c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.335497 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cad177e6-5ee1-4884-bb19-b9413b183acc" (UID: "cad177e6-5ee1-4884-bb19-b9413b183acc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361769 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d592b\" (UniqueName: \"kubernetes.io/projected/bb16b639-2f9c-414f-8cae-41f805a10165-kube-api-access-d592b\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361809 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361825 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl86p\" (UniqueName: \"kubernetes.io/projected/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-kube-api-access-rl86p\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361838 4752 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16b639-2f9c-414f-8cae-41f805a10165-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361850 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361861 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361870 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btb9f\" (UniqueName: \"kubernetes.io/projected/cad177e6-5ee1-4884-bb19-b9413b183acc-kube-api-access-btb9f\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361877 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebdbb722-11b5-43c4-b8dc-8758bbc7164c-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.361885 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad177e6-5ee1-4884-bb19-b9413b183acc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.801358 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.802785 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fcc4l" event={"ID":"bb16b639-2f9c-414f-8cae-41f805a10165","Type":"ContainerDied","Data":"30481aba07e69fc3e26033b104d168a34c5b72a1e529fcab507ccc491b19f3f5"} Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.803026 4752 scope.go:117] "RemoveContainer" containerID="f4d851b9c4185f6911f9a4bc4792f1845bb54d6de1540ddbfaa0c2cd6de11446" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.806615 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dm9bt" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.806916 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dm9bt" event={"ID":"cad177e6-5ee1-4884-bb19-b9413b183acc","Type":"ContainerDied","Data":"93833e8fad58d7a12c47f3930647a04cfea06b810ffd8de7518d4734be27553e"} Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.810123 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8w4t" event={"ID":"ebdbb722-11b5-43c4-b8dc-8758bbc7164c","Type":"ContainerDied","Data":"218c6b1bcb4eac96d3646237e251a6dd2118ccaa7c8244de01170a41b8bbfe8b"} Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.810205 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8w4t" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.815359 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" event={"ID":"96220b7c-718c-4eca-b3e3-46d1143e6124","Type":"ContainerStarted","Data":"ce6f3de1112434618214283a3ca998314d5291c9d6890a227bab4fccbee8c7ff"} Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.816237 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.821760 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.832362 4752 scope.go:117] "RemoveContainer" containerID="2e123599c2c2d3b21164b0784b7b129fe10571f86105f71e3121be9af312e409" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.864670 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s9kms" podStartSLOduration=2.864574693 podStartE2EDuration="2.864574693s" podCreationTimestamp="2026-02-27 17:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:43:08.854276636 +0000 UTC m=+488.761093497" watchObservedRunningTime="2026-02-27 17:43:08.864574693 +0000 UTC m=+488.771391584" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.884206 4752 scope.go:117] "RemoveContainer" containerID="7ba4699f47b6c884353dd851daac687b91155e0bc8b0e36934e7e1b0252f2253" Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.885584 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"] Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.884206 4752 scope.go:117] "RemoveContainer" containerID="7ba4699f47b6c884353dd851daac687b91155e0bc8b0e36934e7e1b0252f2253"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.885584 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.889506 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8w4t"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.916766 4752 scope.go:117] "RemoveContainer" containerID="da05ba83c7087e24142734f5125d0aec25569e538b4475bb5905f4b6eeaa7cc9"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.918062 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137184c7-4f82-4685-89fa-d5152358e216" path="/var/lib/kubelet/pods/137184c7-4f82-4685-89fa-d5152358e216/volumes"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.918976 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" path="/var/lib/kubelet/pods/78323811-0abf-4cc6-921c-5d0e56e895a3/volumes"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.919515 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" path="/var/lib/kubelet/pods/ebdbb722-11b5-43c4-b8dc-8758bbc7164c/volumes"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.920503 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dm9bt"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.920540 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dm9bt"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.921323 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.925355 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fcc4l"]
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.937790 4752 scope.go:117] "RemoveContainer" containerID="8aeb418c116a39ee31cd0026e29f7c590cba724acc16d7b4a09d9ae052d82151"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.957672 4752 scope.go:117] "RemoveContainer" containerID="330339f22036915fad436a77ebf48db28ece31cb5269f8e79a29b514b1908f62"
Feb 27 17:43:08 crc kubenswrapper[4752]: I0227 17:43:08.980278 4752 scope.go:117] "RemoveContainer" containerID="ccff345bec7c8faae051b8382ed00bf24f3490e117e60693c1104c00b7908e3f"
Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725051 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpsdf"]
Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725498 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="extract-utilities"
Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725510 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="extract-utilities"
Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725521 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="extract-content"
Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725527 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="extract-content"
Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725535 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="extract-utilities"
Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725541 4752
state_mem.go:107] "Deleted CPUSet assignment" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725554 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725560 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725569 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725574 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725585 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725590 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725602 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725608 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725615 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725621 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725629 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="extract-content" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725634 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="extract-content" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725643 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725649 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725656 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="extract-content" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725662 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="extract-content" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725755 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdbb722-11b5-43c4-b8dc-8758bbc7164c" containerName="registry-server" Feb 27 17:43:09 crc 
kubenswrapper[4752]: I0227 17:43:09.725767 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="137184c7-4f82-4685-89fa-d5152358e216" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725775 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725780 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="78323811-0abf-4cc6-921c-5d0e56e895a3" containerName="extract-utilities" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725788 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725794 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" containerName="registry-server" Feb 27 17:43:09 crc kubenswrapper[4752]: E0227 17:43:09.725879 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.725886 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" containerName="marketplace-operator" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.726719 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.733993 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.741860 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpsdf"] Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.781108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-utilities\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.781168 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pgtc\" (UniqueName: \"kubernetes.io/projected/cb10b603-75bd-4c13-b326-bcd1837e25c1-kube-api-access-2pgtc\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.781223 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-catalog-content\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.882623 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-utilities\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " 
pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.882723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pgtc\" (UniqueName: \"kubernetes.io/projected/cb10b603-75bd-4c13-b326-bcd1837e25c1-kube-api-access-2pgtc\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.882870 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-catalog-content\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.883637 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-catalog-content\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.884562 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb10b603-75bd-4c13-b326-bcd1837e25c1-utilities\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:09 crc kubenswrapper[4752]: I0227 17:43:09.909823 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pgtc\" (UniqueName: \"kubernetes.io/projected/cb10b603-75bd-4c13-b326-bcd1837e25c1-kube-api-access-2pgtc\") pod \"redhat-marketplace-wpsdf\" (UID: \"cb10b603-75bd-4c13-b326-bcd1837e25c1\") " pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:10 crc kubenswrapper[4752]: I0227 17:43:10.047956 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.297266 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpsdf"] Feb 27 17:43:13 crc kubenswrapper[4752]: W0227 17:43:10.304334 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb10b603_75bd_4c13_b326_bcd1837e25c1.slice/crio-3487e3beaf882e7fbaccb2cd36c637d94cf0f96c0a4e9d0ffd8b50fd9057c8cc WatchSource:0}: Error finding container 3487e3beaf882e7fbaccb2cd36c637d94cf0f96c0a4e9d0ffd8b50fd9057c8cc: Status 404 returned error can't find the container with id 3487e3beaf882e7fbaccb2cd36c637d94cf0f96c0a4e9d0ffd8b50fd9057c8cc Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.746719 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s684v"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.747970 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.752433 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.782569 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s684v"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.826784 4752 generic.go:334] "Generic (PLEG): container finished" podID="cb10b603-75bd-4c13-b326-bcd1837e25c1" containerID="f294707104ddb70ba980a4f0c394e3d656f700430ebc5c35f96b0d31f62726ed" exitCode=0 Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.826866 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpsdf" event={"ID":"cb10b603-75bd-4c13-b326-bcd1837e25c1","Type":"ContainerDied","Data":"f294707104ddb70ba980a4f0c394e3d656f700430ebc5c35f96b0d31f62726ed"} Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.826905 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpsdf" event={"ID":"cb10b603-75bd-4c13-b326-bcd1837e25c1","Type":"ContainerStarted","Data":"3487e3beaf882e7fbaccb2cd36c637d94cf0f96c0a4e9d0ffd8b50fd9057c8cc"} Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.898451 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6275\" (UniqueName: \"kubernetes.io/projected/48eed09c-cb79-4247-82f5-05a408fda589-kube-api-access-f6275\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.898519 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-catalog-content\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.898611 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-utilities\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.912565 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb16b639-2f9c-414f-8cae-41f805a10165" path="/var/lib/kubelet/pods/bb16b639-2f9c-414f-8cae-41f805a10165/volumes" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.913099 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad177e6-5ee1-4884-bb19-b9413b183acc" path="/var/lib/kubelet/pods/cad177e6-5ee1-4884-bb19-b9413b183acc/volumes" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.999809 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6275\" (UniqueName: \"kubernetes.io/projected/48eed09c-cb79-4247-82f5-05a408fda589-kube-api-access-f6275\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.999854 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-catalog-content\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:10.999891 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-utilities\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:11.000403 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-utilities\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:11.000492 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48eed09c-cb79-4247-82f5-05a408fda589-catalog-content\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:11.021518 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6275\" (UniqueName: \"kubernetes.io/projected/48eed09c-cb79-4247-82f5-05a408fda589-kube-api-access-f6275\") pod \"redhat-operators-s684v\" (UID: \"48eed09c-cb79-4247-82f5-05a408fda589\") " pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:11.113259 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.126721 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tv7td"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.130488 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.143127 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.149909 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv7td"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.218370 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-catalog-content\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.218418 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrlz\" (UniqueName: \"kubernetes.io/projected/5295b606-ce89-48c6-809f-36bc6bbfd87f-kube-api-access-zhrlz\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.218469 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-utilities\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.319829 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-utilities\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.319895 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-catalog-content\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.319921 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrlz\" (UniqueName: \"kubernetes.io/projected/5295b606-ce89-48c6-809f-36bc6bbfd87f-kube-api-access-zhrlz\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.320620 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-utilities\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.320721 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5295b606-ce89-48c6-809f-36bc6bbfd87f-catalog-content\") pod \"community-operators-tv7td\" (UID: 
\"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.338656 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrlz\" (UniqueName: \"kubernetes.io/projected/5295b606-ce89-48c6-809f-36bc6bbfd87f-kube-api-access-zhrlz\") pod \"community-operators-tv7td\" (UID: \"5295b606-ce89-48c6-809f-36bc6bbfd87f\") " pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:12.461766 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:13 crc kubenswrapper[4752]: E0227 17:43:12.800994 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:43:13 crc kubenswrapper[4752]: E0227 17:43:12.801290 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pgtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wpsdf_openshift-marketplace(cb10b603-75bd-4c13-b326-bcd1837e25c1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:43:13 crc kubenswrapper[4752]: E0227 17:43:12.802536 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: 
reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-wpsdf" podUID="cb10b603-75bd-4c13-b326-bcd1837e25c1" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.133329 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p45jr"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.135857 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.138067 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.139357 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p45jr"] Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.231050 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbn7\" (UniqueName: \"kubernetes.io/projected/858dd67c-16d0-4e5f-b8a4-a93fec256951-kube-api-access-trbn7\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.231108 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-utilities\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.231197 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-catalog-content\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.332710 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-catalog-content\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.332798 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbn7\" (UniqueName: \"kubernetes.io/projected/858dd67c-16d0-4e5f-b8a4-a93fec256951-kube-api-access-trbn7\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.332847 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-utilities\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.333382 
4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-utilities\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.333594 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858dd67c-16d0-4e5f-b8a4-a93fec256951-catalog-content\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.361583 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbn7\" (UniqueName: \"kubernetes.io/projected/858dd67c-16d0-4e5f-b8a4-a93fec256951-kube-api-access-trbn7\") pod \"certified-operators-p45jr\" (UID: \"858dd67c-16d0-4e5f-b8a4-a93fec256951\") " pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:13 crc kubenswrapper[4752]: I0227 17:43:13.458602 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.161999 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s684v"] Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.171770 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv7td"] Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.182304 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p45jr"] Feb 27 17:43:14 crc kubenswrapper[4752]: W0227 17:43:14.211845 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858dd67c_16d0_4e5f_b8a4_a93fec256951.slice/crio-84f4c8220a2ebcf424cd49237970741cdb5a2a8e1e1d7fbb8d01169ca727b8d7 WatchSource:0}: Error finding container 84f4c8220a2ebcf424cd49237970741cdb5a2a8e1e1d7fbb8d01169ca727b8d7: Status 404 returned error can't find the container with id 84f4c8220a2ebcf424cd49237970741cdb5a2a8e1e1d7fbb8d01169ca727b8d7 Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.855585 4752 generic.go:334] "Generic (PLEG): container finished" podID="48eed09c-cb79-4247-82f5-05a408fda589" containerID="fbecb440a61bf238a79fa7e8d2271cdbafb82b44b17cf4f1256b3d2435806a3f" exitCode=0 Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.855640 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s684v" event={"ID":"48eed09c-cb79-4247-82f5-05a408fda589","Type":"ContainerDied","Data":"fbecb440a61bf238a79fa7e8d2271cdbafb82b44b17cf4f1256b3d2435806a3f"} Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.855745 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s684v" event={"ID":"48eed09c-cb79-4247-82f5-05a408fda589","Type":"ContainerStarted","Data":"8ef0c55440b71c106479902fbf5abc742f855f4f02fc4cf8e29e89c92152ad12"} Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.861935 4752 generic.go:334] "Generic (PLEG): container finished" podID="5295b606-ce89-48c6-809f-36bc6bbfd87f" containerID="fb299ca86468d68c0fd606cfd7d76970af8a98313d91e2d21e5b32b4de68829f" exitCode=0 Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 
17:43:14.861983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv7td" event={"ID":"5295b606-ce89-48c6-809f-36bc6bbfd87f","Type":"ContainerDied","Data":"fb299ca86468d68c0fd606cfd7d76970af8a98313d91e2d21e5b32b4de68829f"} Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.861998 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv7td" event={"ID":"5295b606-ce89-48c6-809f-36bc6bbfd87f","Type":"ContainerStarted","Data":"b09aea4afd847003b9113c14fcc9cc659ed672a2b921f4e373d9944e6839ca3a"} Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.864218 4752 generic.go:334] "Generic (PLEG): container finished" podID="858dd67c-16d0-4e5f-b8a4-a93fec256951" containerID="4295b882a049ef10fc8923bcd09a2a4a7766388ef8b00be9cf97aa622b5e40ce" exitCode=0 Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.864243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p45jr" event={"ID":"858dd67c-16d0-4e5f-b8a4-a93fec256951","Type":"ContainerDied","Data":"4295b882a049ef10fc8923bcd09a2a4a7766388ef8b00be9cf97aa622b5e40ce"} Feb 27 17:43:14 crc kubenswrapper[4752]: I0227 17:43:14.864259 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p45jr" event={"ID":"858dd67c-16d0-4e5f-b8a4-a93fec256951","Type":"ContainerStarted","Data":"84f4c8220a2ebcf424cd49237970741cdb5a2a8e1e1d7fbb8d01169ca727b8d7"} Feb 27 17:43:15 crc kubenswrapper[4752]: E0227 17:43:15.572322 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:43:15 crc kubenswrapper[4752]: E0227 17:43:15.572475 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhrlz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tv7td_openshift-marketplace(5295b606-ce89-48c6-809f-36bc6bbfd87f): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:43:15 crc kubenswrapper[4752]: E0227 17:43:15.573563 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-tv7td" podUID="5295b606-ce89-48c6-809f-36bc6bbfd87f" Feb 27 17:43:15 crc kubenswrapper[4752]: E0227 17:43:15.959599 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tv7td" podUID="5295b606-ce89-48c6-809f-36bc6bbfd87f" Feb 27 17:43:16 crc kubenswrapper[4752]: I0227 17:43:16.885316 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p45jr" event={"ID":"858dd67c-16d0-4e5f-b8a4-a93fec256951","Type":"ContainerStarted","Data":"c30692a9c099accd6877e8f82aac48975f1402754f83e9c53ca1e76f42acb785"} Feb 27 17:43:16 crc kubenswrapper[4752]: I0227 17:43:16.895367 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s684v" event={"ID":"48eed09c-cb79-4247-82f5-05a408fda589","Type":"ContainerStarted","Data":"d8de8d8cb6ec0d7962a19b12fe204321efcb0b0c527165839dc7ec73404bb598"} Feb 27 17:43:17 crc kubenswrapper[4752]: I0227 17:43:17.905799 4752 generic.go:334] "Generic (PLEG): container finished" podID="48eed09c-cb79-4247-82f5-05a408fda589" 
containerID="d8de8d8cb6ec0d7962a19b12fe204321efcb0b0c527165839dc7ec73404bb598" exitCode=0 Feb 27 17:43:17 crc kubenswrapper[4752]: I0227 17:43:17.909751 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s684v" event={"ID":"48eed09c-cb79-4247-82f5-05a408fda589","Type":"ContainerDied","Data":"d8de8d8cb6ec0d7962a19b12fe204321efcb0b0c527165839dc7ec73404bb598"} Feb 27 17:43:17 crc kubenswrapper[4752]: I0227 17:43:17.911361 4752 generic.go:334] "Generic (PLEG): container finished" podID="858dd67c-16d0-4e5f-b8a4-a93fec256951" containerID="c30692a9c099accd6877e8f82aac48975f1402754f83e9c53ca1e76f42acb785" exitCode=0 Feb 27 17:43:17 crc kubenswrapper[4752]: I0227 17:43:17.911408 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p45jr" event={"ID":"858dd67c-16d0-4e5f-b8a4-a93fec256951","Type":"ContainerDied","Data":"c30692a9c099accd6877e8f82aac48975f1402754f83e9c53ca1e76f42acb785"} Feb 27 17:43:18 crc kubenswrapper[4752]: I0227 17:43:18.924235 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s684v" event={"ID":"48eed09c-cb79-4247-82f5-05a408fda589","Type":"ContainerStarted","Data":"2ca498a6bd0697abd2940c2cbd9479e65d94dad573c334948f01b7e63fe2a617"} Feb 27 17:43:18 crc kubenswrapper[4752]: I0227 17:43:18.928460 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p45jr" event={"ID":"858dd67c-16d0-4e5f-b8a4-a93fec256951","Type":"ContainerStarted","Data":"8be5bccd48d0841c07587ffb599eb1797c7d6c505cb2c32b8f7ef5744a255805"} Feb 27 17:43:18 crc kubenswrapper[4752]: I0227 17:43:18.952399 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s684v" podStartSLOduration=5.417205092 podStartE2EDuration="8.952369438s" podCreationTimestamp="2026-02-27 17:43:10 +0000 UTC" firstStartedPulling="2026-02-27 17:43:14.858030038 +0000 UTC m=+494.764846919" lastFinishedPulling="2026-02-27 17:43:18.393194414 +0000 UTC m=+498.300011265" observedRunningTime="2026-02-27 17:43:18.948955473 +0000 UTC m=+498.855772354" watchObservedRunningTime="2026-02-27 17:43:18.952369438 +0000 UTC m=+498.859186329" Feb 27 17:43:21 crc kubenswrapper[4752]: I0227 17:43:21.114002 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:21 crc kubenswrapper[4752]: I0227 17:43:21.114097 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:22 crc kubenswrapper[4752]: I0227 17:43:22.164719 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s684v" podUID="48eed09c-cb79-4247-82f5-05a408fda589" containerName="registry-server" probeResult="failure" output=< Feb 27 17:43:22 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Feb 27 17:43:22 crc kubenswrapper[4752]: > Feb 27 17:43:22 crc kubenswrapper[4752]: I0227 17:43:22.607432 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" podUID="57573690-e945-43f5-b3ed-e3451f5a8a47" containerName="registry" containerID="cri-o://2a00d9700aac7ca41290f95c8c78dedc3add9ca04dc4b893d762f6a1d1026e12" gracePeriod=30 Feb 27 17:43:22 crc kubenswrapper[4752]: I0227 17:43:22.952380 4752 generic.go:334] "Generic (PLEG): container finished" 
podID="57573690-e945-43f5-b3ed-e3451f5a8a47" containerID="2a00d9700aac7ca41290f95c8c78dedc3add9ca04dc4b893d762f6a1d1026e12" exitCode=0 Feb 27 17:43:22 crc kubenswrapper[4752]: I0227 17:43:22.952439 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" event={"ID":"57573690-e945-43f5-b3ed-e3451f5a8a47","Type":"ContainerDied","Data":"2a00d9700aac7ca41290f95c8c78dedc3add9ca04dc4b893d762f6a1d1026e12"} Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.459692 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.460044 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.524908 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.551289 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p45jr" podStartSLOduration=7.061909077 podStartE2EDuration="10.551271063s" podCreationTimestamp="2026-02-27 17:43:13 +0000 UTC" firstStartedPulling="2026-02-27 17:43:14.867598606 +0000 UTC m=+494.774415457" lastFinishedPulling="2026-02-27 17:43:18.356960592 +0000 UTC m=+498.263777443" observedRunningTime="2026-02-27 17:43:18.983315878 +0000 UTC m=+498.890132739" watchObservedRunningTime="2026-02-27 17:43:23.551271063 +0000 UTC m=+503.458087924" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.565847 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690293 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690370 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690431 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690629 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690668 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690713 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690799 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk4zr\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.690855 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets\") pod \"57573690-e945-43f5-b3ed-e3451f5a8a47\" (UID: \"57573690-e945-43f5-b3ed-e3451f5a8a47\") " Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.691702 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.692194 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.699625 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr" (OuterVolumeSpecName: "kube-api-access-fk4zr") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "kube-api-access-fk4zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.700045 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.701016 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.701649 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.706442 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.715814 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "57573690-e945-43f5-b3ed-e3451f5a8a47" (UID: "57573690-e945-43f5-b3ed-e3451f5a8a47"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792278 4752 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792332 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk4zr\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-kube-api-access-fk4zr\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792354 4752 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57573690-e945-43f5-b3ed-e3451f5a8a47-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792374 4752 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57573690-e945-43f5-b3ed-e3451f5a8a47-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792392 4752 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792409 4752 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57573690-e945-43f5-b3ed-e3451f5a8a47-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.792425 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57573690-e945-43f5-b3ed-e3451f5a8a47-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.981335 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" event={"ID":"57573690-e945-43f5-b3ed-e3451f5a8a47","Type":"ContainerDied","Data":"637a6aaff9e442b73c8adbb7bedb8cae14905e19dd33712c20caabc6b8f0f12f"} Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.981389 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r47g5" Feb 27 17:43:23 crc kubenswrapper[4752]: I0227 17:43:23.981445 4752 scope.go:117] "RemoveContainer" containerID="2a00d9700aac7ca41290f95c8c78dedc3add9ca04dc4b893d762f6a1d1026e12" Feb 27 17:43:24 crc kubenswrapper[4752]: I0227 17:43:24.044272 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"] Feb 27 17:43:24 crc kubenswrapper[4752]: I0227 17:43:24.052354 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r47g5"] Feb 27 17:43:24 crc kubenswrapper[4752]: I0227 17:43:24.057413 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p45jr" Feb 27 17:43:24 crc kubenswrapper[4752]: I0227 17:43:24.915804 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57573690-e945-43f5-b3ed-e3451f5a8a47" path="/var/lib/kubelet/pods/57573690-e945-43f5-b3ed-e3451f5a8a47/volumes" Feb 27 17:43:25 crc kubenswrapper[4752]: E0227 17:43:25.758215 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:43:25 crc kubenswrapper[4752]: E0227 17:43:25.758669 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pgtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wpsdf_openshift-marketplace(cb10b603-75bd-4c13-b326-bcd1837e25c1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:43:25 crc kubenswrapper[4752]: E0227 17:43:25.760204 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-wpsdf" podUID="cb10b603-75bd-4c13-b326-bcd1837e25c1" Feb 27 17:43:28 crc kubenswrapper[4752]: I0227 17:43:28.009423 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv7td" event={"ID":"5295b606-ce89-48c6-809f-36bc6bbfd87f","Type":"ContainerStarted","Data":"792802f13a62bcbabb9c4b92ffce1f15ab4078bcb94c91292232798b0182fd64"} Feb 27 17:43:29 crc kubenswrapper[4752]: I0227 17:43:29.022563 4752 generic.go:334] "Generic (PLEG): container finished" podID="5295b606-ce89-48c6-809f-36bc6bbfd87f" containerID="792802f13a62bcbabb9c4b92ffce1f15ab4078bcb94c91292232798b0182fd64" exitCode=0 Feb 27 17:43:29 crc kubenswrapper[4752]: I0227 17:43:29.022635 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv7td" event={"ID":"5295b606-ce89-48c6-809f-36bc6bbfd87f","Type":"ContainerDied","Data":"792802f13a62bcbabb9c4b92ffce1f15ab4078bcb94c91292232798b0182fd64"} Feb 27 17:43:31 crc kubenswrapper[4752]: I0227 17:43:31.038918 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv7td" event={"ID":"5295b606-ce89-48c6-809f-36bc6bbfd87f","Type":"ContainerStarted","Data":"27600691fdcec7c4081a3e6141c04da66896e94d3eae2c32ed4827c99c524ac7"} Feb 27 17:43:31 crc kubenswrapper[4752]: I0227 17:43:31.066697 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tv7td" podStartSLOduration=3.364308335 podStartE2EDuration="19.06667575s" podCreationTimestamp="2026-02-27 17:43:12 +0000 UTC" firstStartedPulling="2026-02-27 17:43:14.864713354 +0000 UTC m=+494.771530235" lastFinishedPulling="2026-02-27 17:43:30.567080759 +0000 UTC m=+510.473897650" observedRunningTime="2026-02-27 17:43:31.064114507 +0000 UTC m=+510.970931368" watchObservedRunningTime="2026-02-27 17:43:31.06667575 +0000 UTC m=+510.973492611" Feb 27 17:43:31 crc kubenswrapper[4752]: I0227 17:43:31.178765 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:31 crc kubenswrapper[4752]: I0227 17:43:31.232019 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s684v" Feb 27 17:43:32 crc kubenswrapper[4752]: I0227 17:43:32.462370 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:32 crc kubenswrapper[4752]: I0227 17:43:32.462789 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:33 crc kubenswrapper[4752]: I0227 17:43:33.534176 4752 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-tv7td" podUID="5295b606-ce89-48c6-809f-36bc6bbfd87f" containerName="registry-server" probeResult="failure" output=< Feb 27 17:43:33 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s Feb 27 17:43:33 crc kubenswrapper[4752]: > Feb 27 17:43:36 crc kubenswrapper[4752]: I0227 17:43:36.323806 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:43:36 crc kubenswrapper[4752]: I0227 17:43:36.324228 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:43:36 crc kubenswrapper[4752]: I0227 17:43:36.324295 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:43:36 crc kubenswrapper[4752]: I0227 17:43:36.329034 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:43:36 crc kubenswrapper[4752]: I0227 17:43:36.330342 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d" gracePeriod=600 Feb 27 17:43:36 crc kubenswrapper[4752]: E0227 17:43:36.909123 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wpsdf" podUID="cb10b603-75bd-4c13-b326-bcd1837e25c1" Feb 27 17:43:37 crc kubenswrapper[4752]: I0227 17:43:37.082078 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d" exitCode=0 Feb 27 17:43:37 crc kubenswrapper[4752]: I0227 17:43:37.082166 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d"} Feb 27 17:43:37 crc kubenswrapper[4752]: I0227 17:43:37.082216 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269"} Feb 27 17:43:37 crc kubenswrapper[4752]: I0227 17:43:37.082244 4752 scope.go:117] "RemoveContainer" 
containerID="4a78e5a0164b37185d7cf25f07c0ea5a1fd6df0d23a1ff173bf085817d2f672f" Feb 27 17:43:42 crc kubenswrapper[4752]: I0227 17:43:42.511494 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:42 crc kubenswrapper[4752]: I0227 17:43:42.575136 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tv7td" Feb 27 17:43:53 crc kubenswrapper[4752]: I0227 17:43:53.176096 4752 generic.go:334] "Generic (PLEG): container finished" podID="cb10b603-75bd-4c13-b326-bcd1837e25c1" containerID="ad2e81a94a42a2fd244c135cf59a80a4794844fbb76c2a24f15e1be10607f929" exitCode=0 Feb 27 17:43:53 crc kubenswrapper[4752]: I0227 17:43:53.176196 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpsdf" event={"ID":"cb10b603-75bd-4c13-b326-bcd1837e25c1","Type":"ContainerDied","Data":"ad2e81a94a42a2fd244c135cf59a80a4794844fbb76c2a24f15e1be10607f929"} Feb 27 17:43:54 crc kubenswrapper[4752]: I0227 17:43:54.186609 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpsdf" event={"ID":"cb10b603-75bd-4c13-b326-bcd1837e25c1","Type":"ContainerStarted","Data":"76151f5d52e258308862a861048eb85a26978c34b86769a667a3d4c05b1ade7e"} Feb 27 17:43:54 crc kubenswrapper[4752]: I0227 17:43:54.511134 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpsdf" podStartSLOduration=3.7510375700000003 podStartE2EDuration="45.511109082s" podCreationTimestamp="2026-02-27 17:43:09 +0000 UTC" firstStartedPulling="2026-02-27 17:43:11.837340444 +0000 UTC m=+491.744157325" lastFinishedPulling="2026-02-27 17:43:53.597411956 +0000 UTC m=+533.504228837" observedRunningTime="2026-02-27 17:43:54.503980874 +0000 UTC m=+534.410797785" watchObservedRunningTime="2026-02-27 17:43:54.511109082 +0000 UTC m=+534.417925973" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.049013 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.049405 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.116117 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.150631 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536904-mb7cz"] Feb 27 17:44:00 crc kubenswrapper[4752]: E0227 17:44:00.150895 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57573690-e945-43f5-b3ed-e3451f5a8a47" containerName="registry" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.150954 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="57573690-e945-43f5-b3ed-e3451f5a8a47" containerName="registry" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.151053 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="57573690-e945-43f5-b3ed-e3451f5a8a47" containerName="registry" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.151463 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.155225 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.155255 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.155464 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.173310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536904-mb7cz"] Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.190118 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dfc\" (UniqueName: \"kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc\") pod \"auto-csr-approver-29536904-mb7cz\" (UID: \"a91adcd2-9d66-4213-b9d5-09781e0e3401\") " pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.290889 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpsdf" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.291062 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72dfc\" (UniqueName: \"kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc\") pod \"auto-csr-approver-29536904-mb7cz\" (UID: \"a91adcd2-9d66-4213-b9d5-09781e0e3401\") " pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.314762 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dfc\" (UniqueName: \"kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc\") pod \"auto-csr-approver-29536904-mb7cz\" (UID: \"a91adcd2-9d66-4213-b9d5-09781e0e3401\") " pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.468788 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:44:00 crc kubenswrapper[4752]: I0227 17:44:00.693792 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536904-mb7cz"] Feb 27 17:44:01 crc kubenswrapper[4752]: I0227 17:44:01.251977 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" event={"ID":"a91adcd2-9d66-4213-b9d5-09781e0e3401","Type":"ContainerStarted","Data":"f53ee0aaaa2c2d4258c67f4039b6d3bf17b0eaa8a4375b155740f70053238246"} Feb 27 17:44:05 crc kubenswrapper[4752]: E0227 17:44:05.868683 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:44:05 crc kubenswrapper[4752]: E0227 17:44:05.869030 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:44:05 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:44:05 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-72dfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536904-mb7cz_openshift-infra(a91adcd2-9d66-4213-b9d5-09781e0e3401): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:44:05 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:44:05 crc kubenswrapper[4752]: E0227 17:44:05.870237 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:44:06 crc kubenswrapper[4752]: E0227 17:44:06.303935 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:44:17 crc kubenswrapper[4752]: I0227 17:44:17.910759 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 17:44:18 crc kubenswrapper[4752]: E0227 17:44:18.979693 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:44:18 crc kubenswrapper[4752]: E0227 17:44:18.979971 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:44:18 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:44:18 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-72dfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536904-mb7cz_openshift-infra(a91adcd2-9d66-4213-b9d5-09781e0e3401): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:44:18 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:44:18 crc kubenswrapper[4752]: E0227 17:44:18.981285 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:44:32 crc kubenswrapper[4752]: E0227 17:44:32.909200 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:44:46 crc kubenswrapper[4752]: E0227 17:44:46.037768 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:44:46 crc kubenswrapper[4752]: E0227 17:44:46.038416 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:44:46 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:44:46 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-72dfc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536904-mb7cz_openshift-infra(a91adcd2-9d66-4213-b9d5-09781e0e3401): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:44:46 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:44:46 crc kubenswrapper[4752]: E0227 17:44:46.040011 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.149172 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt"] Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.151414 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.154722 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.155383 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.169052 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt"] Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.316484 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.316548 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6cp9\" (UniqueName: \"kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.316570 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.418197 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.418738 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6cp9\" (UniqueName: \"kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.418997 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.420284 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume\") pod 
\"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.429810 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.438993 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6cp9\" (UniqueName: \"kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9\") pod \"collect-profiles-29536905-vpbkt\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.480022 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:00 crc kubenswrapper[4752]: I0227 17:45:00.904682 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt"] Feb 27 17:45:00 crc kubenswrapper[4752]: W0227 17:45:00.910121 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010d4fe8_07db_45bb_b700_711476e949ac.slice/crio-7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2 WatchSource:0}: Error finding container 7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2: Status 404 returned error can't find the container with id 7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2 Feb 27 17:45:00 crc kubenswrapper[4752]: E0227 17:45:00.922938 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:45:01 crc kubenswrapper[4752]: I0227 17:45:01.682938 4752 generic.go:334] "Generic (PLEG): container finished" podID="010d4fe8-07db-45bb-b700-711476e949ac" containerID="ea682b4c316f680bf361160c8599778a69c203e2fabb0abf8092930ed4e7f7c4" exitCode=0 Feb 27 17:45:01 crc kubenswrapper[4752]: I0227 17:45:01.683008 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" event={"ID":"010d4fe8-07db-45bb-b700-711476e949ac","Type":"ContainerDied","Data":"ea682b4c316f680bf361160c8599778a69c203e2fabb0abf8092930ed4e7f7c4"} Feb 27 17:45:01 crc kubenswrapper[4752]: I0227 17:45:01.683302 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" event={"ID":"010d4fe8-07db-45bb-b700-711476e949ac","Type":"ContainerStarted","Data":"7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2"} Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.923706 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.986630 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume\") pod \"010d4fe8-07db-45bb-b700-711476e949ac\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.986705 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume\") pod \"010d4fe8-07db-45bb-b700-711476e949ac\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.986736 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6cp9\" (UniqueName: \"kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9\") pod \"010d4fe8-07db-45bb-b700-711476e949ac\" (UID: \"010d4fe8-07db-45bb-b700-711476e949ac\") " Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.987727 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "010d4fe8-07db-45bb-b700-711476e949ac" (UID: "010d4fe8-07db-45bb-b700-711476e949ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.992200 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "010d4fe8-07db-45bb-b700-711476e949ac" (UID: "010d4fe8-07db-45bb-b700-711476e949ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:45:02 crc kubenswrapper[4752]: I0227 17:45:02.993804 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9" (OuterVolumeSpecName: "kube-api-access-f6cp9") pod "010d4fe8-07db-45bb-b700-711476e949ac" (UID: "010d4fe8-07db-45bb-b700-711476e949ac"). InnerVolumeSpecName "kube-api-access-f6cp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.088256 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/010d4fe8-07db-45bb-b700-711476e949ac-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.088286 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/010d4fe8-07db-45bb-b700-711476e949ac-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.088296 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6cp9\" (UniqueName: \"kubernetes.io/projected/010d4fe8-07db-45bb-b700-711476e949ac-kube-api-access-f6cp9\") on node \"crc\" DevicePath \"\"" Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.700983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" event={"ID":"010d4fe8-07db-45bb-b700-711476e949ac","Type":"ContainerDied","Data":"7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2"} Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.701044 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536905-vpbkt" Feb 27 17:45:03 crc kubenswrapper[4752]: I0227 17:45:03.701057 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ad17c7d36d6ee80049ffbbe051e2e3e360586e5892b349cb009787af82e7ad2" Feb 27 17:45:13 crc kubenswrapper[4752]: E0227 17:45:13.909021 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:45:25 crc kubenswrapper[4752]: E0227 17:45:25.910708 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" Feb 27 17:45:36 crc kubenswrapper[4752]: I0227 17:45:36.323852 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:45:36 crc kubenswrapper[4752]: I0227 17:45:36.324394 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:45:38 crc kubenswrapper[4752]: I0227 17:45:38.938022 4752 generic.go:334] "Generic (PLEG): container finished" podID="a91adcd2-9d66-4213-b9d5-09781e0e3401" containerID="2fc07b2dafe4b0a53be5ef4318b5c67a85c4e8d2653bc57970bcfb4d0a6f3496" exitCode=0 Feb 27 17:45:38 crc kubenswrapper[4752]: I0227 17:45:38.938308 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29536904-mb7cz" event={"ID":"a91adcd2-9d66-4213-b9d5-09781e0e3401","Type":"ContainerDied","Data":"2fc07b2dafe4b0a53be5ef4318b5c67a85c4e8d2653bc57970bcfb4d0a6f3496"} Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.264249 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.318406 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72dfc\" (UniqueName: \"kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc\") pod \"a91adcd2-9d66-4213-b9d5-09781e0e3401\" (UID: \"a91adcd2-9d66-4213-b9d5-09781e0e3401\") " Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.326556 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc" (OuterVolumeSpecName: "kube-api-access-72dfc") pod "a91adcd2-9d66-4213-b9d5-09781e0e3401" (UID: "a91adcd2-9d66-4213-b9d5-09781e0e3401"). InnerVolumeSpecName "kube-api-access-72dfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.419648 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72dfc\" (UniqueName: \"kubernetes.io/projected/a91adcd2-9d66-4213-b9d5-09781e0e3401-kube-api-access-72dfc\") on node \"crc\" DevicePath \"\"" Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.955768 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" event={"ID":"a91adcd2-9d66-4213-b9d5-09781e0e3401","Type":"ContainerDied","Data":"f53ee0aaaa2c2d4258c67f4039b6d3bf17b0eaa8a4375b155740f70053238246"} Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.955965 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53ee0aaaa2c2d4258c67f4039b6d3bf17b0eaa8a4375b155740f70053238246" Feb 27 17:45:40 crc kubenswrapper[4752]: I0227 17:45:40.955880 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536904-mb7cz" Feb 27 17:45:41 crc kubenswrapper[4752]: I0227 17:45:41.332010 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536898-598km"] Feb 27 17:45:41 crc kubenswrapper[4752]: I0227 17:45:41.336489 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536898-598km"] Feb 27 17:45:42 crc kubenswrapper[4752]: I0227 17:45:42.919651 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc36acda-9447-479d-b741-c063ecb91f3e" path="/var/lib/kubelet/pods/cc36acda-9447-479d-b741-c063ecb91f3e/volumes" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.154590 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536906-xdhv9"] Feb 27 17:46:00 crc kubenswrapper[4752]: E0227 17:46:00.155425 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="010d4fe8-07db-45bb-b700-711476e949ac" containerName="collect-profiles" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.155446 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="010d4fe8-07db-45bb-b700-711476e949ac" containerName="collect-profiles" Feb 27 17:46:00 crc kubenswrapper[4752]: E0227 17:46:00.155480 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" containerName="oc" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.155492 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" containerName="oc" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.155665 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" containerName="oc" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.155689 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="010d4fe8-07db-45bb-b700-711476e949ac" containerName="collect-profiles" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.156318 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.159249 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.161835 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.166520 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536906-xdhv9"] Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.168043 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.182282 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4ns\" (UniqueName: \"kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns\") pod \"auto-csr-approver-29536906-xdhv9\" (UID: \"90b0bdc4-d072-4dff-ab6a-c4b02431d46c\") " pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.283047 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4ns\" (UniqueName: \"kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns\") pod \"auto-csr-approver-29536906-xdhv9\" (UID: \"90b0bdc4-d072-4dff-ab6a-c4b02431d46c\") " pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.306197 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4ns\" (UniqueName: \"kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns\") pod \"auto-csr-approver-29536906-xdhv9\" (UID: \"90b0bdc4-d072-4dff-ab6a-c4b02431d46c\") " pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.480761 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:00 crc kubenswrapper[4752]: I0227 17:46:00.717421 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536906-xdhv9"] Feb 27 17:46:01 crc kubenswrapper[4752]: I0227 17:46:01.084361 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" event={"ID":"90b0bdc4-d072-4dff-ab6a-c4b02431d46c","Type":"ContainerStarted","Data":"1a77326c94bd76c949b9bcdc13ec89d7a2ec5d31ac0838ada783ef8ef276e530"} Feb 27 17:46:05 crc kubenswrapper[4752]: I0227 17:46:05.112943 4752 generic.go:334] "Generic (PLEG): container finished" podID="90b0bdc4-d072-4dff-ab6a-c4b02431d46c" containerID="43ed2cb25ef8cbe5b9ae4d0d046dfce4e123bd2d0f29b45c8e6c1003cfd24efa" exitCode=0 Feb 27 17:46:05 crc kubenswrapper[4752]: I0227 17:46:05.113380 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" event={"ID":"90b0bdc4-d072-4dff-ab6a-c4b02431d46c","Type":"ContainerDied","Data":"43ed2cb25ef8cbe5b9ae4d0d046dfce4e123bd2d0f29b45c8e6c1003cfd24efa"} Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.324232 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.324604 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.401791 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.462214 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m4ns\" (UniqueName: \"kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns\") pod \"90b0bdc4-d072-4dff-ab6a-c4b02431d46c\" (UID: \"90b0bdc4-d072-4dff-ab6a-c4b02431d46c\") " Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.467428 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns" (OuterVolumeSpecName: "kube-api-access-7m4ns") pod "90b0bdc4-d072-4dff-ab6a-c4b02431d46c" (UID: "90b0bdc4-d072-4dff-ab6a-c4b02431d46c"). InnerVolumeSpecName "kube-api-access-7m4ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:46:06 crc kubenswrapper[4752]: I0227 17:46:06.564370 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m4ns\" (UniqueName: \"kubernetes.io/projected/90b0bdc4-d072-4dff-ab6a-c4b02431d46c-kube-api-access-7m4ns\") on node \"crc\" DevicePath \"\"" Feb 27 17:46:07 crc kubenswrapper[4752]: I0227 17:46:07.127698 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" event={"ID":"90b0bdc4-d072-4dff-ab6a-c4b02431d46c","Type":"ContainerDied","Data":"1a77326c94bd76c949b9bcdc13ec89d7a2ec5d31ac0838ada783ef8ef276e530"} Feb 27 17:46:07 crc kubenswrapper[4752]: I0227 17:46:07.127988 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a77326c94bd76c949b9bcdc13ec89d7a2ec5d31ac0838ada783ef8ef276e530" Feb 27 17:46:07 crc kubenswrapper[4752]: I0227 17:46:07.127852 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536906-xdhv9" Feb 27 17:46:07 crc kubenswrapper[4752]: I0227 17:46:07.463604 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536900-5b8dn"] Feb 27 17:46:07 crc kubenswrapper[4752]: I0227 17:46:07.467538 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536900-5b8dn"] Feb 27 17:46:08 crc kubenswrapper[4752]: I0227 17:46:08.918571 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7d2d1c-023b-43e1-9015-5b572f4648cf" path="/var/lib/kubelet/pods/4c7d2d1c-023b-43e1-9015-5b572f4648cf/volumes" Feb 27 17:46:36 crc kubenswrapper[4752]: I0227 17:46:36.323975 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:46:36 crc kubenswrapper[4752]: I0227 17:46:36.324657 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:46:36 crc kubenswrapper[4752]: I0227 17:46:36.324731 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:46:36 crc kubenswrapper[4752]: I0227 17:46:36.325722 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:46:36 crc kubenswrapper[4752]: I0227 17:46:36.325921 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269" gracePeriod=600 Feb 27 17:46:37 crc kubenswrapper[4752]: I0227 17:46:37.373976 4752 generic.go:334] "Generic 
(PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269" exitCode=0 Feb 27 17:46:37 crc kubenswrapper[4752]: I0227 17:46:37.374094 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269"} Feb 27 17:46:37 crc kubenswrapper[4752]: I0227 17:46:37.374755 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83"} Feb 27 17:46:37 crc kubenswrapper[4752]: I0227 17:46:37.374808 4752 scope.go:117] "RemoveContainer" containerID="048d588cdf52639f640933e2d926a86b51d60c9944af1020f69bdb46dab3553d" Feb 27 17:47:02 crc kubenswrapper[4752]: I0227 17:47:02.823033 4752 scope.go:117] "RemoveContainer" containerID="6a5e666e4d0f413f0dd7cd7a6f980f7e2bd91a7aa2dcdfddb7d828703dd53103" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.154258 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536908-zvbsb"] Feb 27 17:48:00 crc kubenswrapper[4752]: E0227 17:48:00.155252 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b0bdc4-d072-4dff-ab6a-c4b02431d46c" containerName="oc" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.155275 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0bdc4-d072-4dff-ab6a-c4b02431d46c" containerName="oc" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.155460 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b0bdc4-d072-4dff-ab6a-c4b02431d46c" containerName="oc" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.156039 4752 util.go:30] "No sandbox for pod can be found. 
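The machine-config-daemon sequence above is the complete liveness-probe path: failed HTTP GETs against 127.0.0.1:8798/health at 17:45:36 and 17:46:06, then on the 17:46:36 failure (consistent with a failureThreshold of 3 at a 30-second period) the kubelet kills container 4e626018... with the pod's 600-second grace period and starts replacement a53f865d.... To inspect the probe definition and the resulting restart count (jsonpath filter support in the installed oc is assumed):

    oc -n openshift-machine-config-operator get pod machine-config-daemon-cm8wb \
      -o jsonpath='{.spec.containers[?(@.name=="machine-config-daemon")].livenessProbe}'

    # Replay the probe from the node
    curl -v http://127.0.0.1:8798/health

    oc -n openshift-machine-config-operator get pod machine-config-daemon-cm8wb \
      -o jsonpath='{.status.containerStatuses[?(@.name=="machine-config-daemon")].restartCount}'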
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.159489 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.160053 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.160570 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536908-zvbsb"] Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.160931 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.327432 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbqcp\" (UniqueName: \"kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp\") pod \"auto-csr-approver-29536908-zvbsb\" (UID: \"4babdb15-835b-4965-9af0-4a697c85f645\") " pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.428338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbqcp\" (UniqueName: \"kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp\") pod \"auto-csr-approver-29536908-zvbsb\" (UID: \"4babdb15-835b-4965-9af0-4a697c85f645\") " pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.453065 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbqcp\" (UniqueName: \"kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp\") pod \"auto-csr-approver-29536908-zvbsb\" (UID: \"4babdb15-835b-4965-9af0-4a697c85f645\") " pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.479886 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.725850 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536908-zvbsb"] Feb 27 17:48:00 crc kubenswrapper[4752]: I0227 17:48:00.919257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" event={"ID":"4babdb15-835b-4965-9af0-4a697c85f645","Type":"ContainerStarted","Data":"ded5fe56ed707bec1cabd17f2d8b407c4e8d22f5642c6b5960a86c1d9a6c5f7d"} Feb 27 17:48:01 crc kubenswrapper[4752]: E0227 17:48:01.687326 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:48:01 crc kubenswrapper[4752]: E0227 17:48:01.688502 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:48:01 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:48:01 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbqcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536908-zvbsb_openshift-infra(4babdb15-835b-4965-9af0-4a697c85f645): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:48:01 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:48:01 crc kubenswrapper[4752]: E0227 17:48:01.689719 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:01 crc kubenswrapper[4752]: E0227 17:48:01.928206 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:02 crc kubenswrapper[4752]: I0227 17:48:02.876250 4752 scope.go:117] "RemoveContainer" containerID="ac44873d4f7c3d3f68400f980f54120965c4903584aae35e308bc7e3fc7a8c4b" Feb 27 17:48:13 crc kubenswrapper[4752]: E0227 17:48:13.780324 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:48:13 crc kubenswrapper[4752]: E0227 17:48:13.781138 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:48:13 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:48:13 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbqcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536908-zvbsb_openshift-infra(4babdb15-835b-4965-9af0-4a697c85f645): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:48:13 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:48:13 crc kubenswrapper[4752]: E0227 17:48:13.782322 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:14 crc kubenswrapper[4752]: I0227 17:48:14.994434 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-2vhjd"] Feb 27 17:48:14 crc kubenswrapper[4752]: I0227 17:48:14.995534 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2vhjd" Feb 27 17:48:14 crc kubenswrapper[4752]: I0227 17:48:14.997102 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 27 17:48:14 crc kubenswrapper[4752]: I0227 17:48:14.997587 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-rlwrx" Feb 27 17:48:14 crc kubenswrapper[4752]: I0227 17:48:14.997822 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.007561 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg"] Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.008495 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.015915 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2vhjd"] Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.016031 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tg6xx" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.027779 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg"] Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.031708 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xvqws"] Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.032560 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.034561 4752 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qrxr4" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.041046 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xvqws"] Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.141234 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkcl\" (UniqueName: \"kubernetes.io/projected/a7ea8051-fa57-4c00-a8f8-2f4f696701d4-kube-api-access-hxkcl\") pod \"cert-manager-webhook-687f57d79b-xvqws\" (UID: \"a7ea8051-fa57-4c00-a8f8-2f4f696701d4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.141818 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2x4d\" (UniqueName: \"kubernetes.io/projected/f225bed8-10a2-4f7c-b1fa-cbd00a97e654-kube-api-access-d2x4d\") pod \"cert-manager-cainjector-cf98fcc89-jm9qg\" (UID: \"f225bed8-10a2-4f7c-b1fa-cbd00a97e654\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.141895 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mj7w\" (UniqueName: \"kubernetes.io/projected/834711ff-fc4f-4160-b828-c695168e91f0-kube-api-access-4mj7w\") pod \"cert-manager-858654f9db-2vhjd\" (UID: \"834711ff-fc4f-4160-b828-c695168e91f0\") " pod="cert-manager/cert-manager-858654f9db-2vhjd" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.243219 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkcl\" (UniqueName: \"kubernetes.io/projected/a7ea8051-fa57-4c00-a8f8-2f4f696701d4-kube-api-access-hxkcl\") pod \"cert-manager-webhook-687f57d79b-xvqws\" (UID: \"a7ea8051-fa57-4c00-a8f8-2f4f696701d4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.243345 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x4d\" (UniqueName: \"kubernetes.io/projected/f225bed8-10a2-4f7c-b1fa-cbd00a97e654-kube-api-access-d2x4d\") pod \"cert-manager-cainjector-cf98fcc89-jm9qg\" (UID: \"f225bed8-10a2-4f7c-b1fa-cbd00a97e654\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.243417 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mj7w\" (UniqueName: \"kubernetes.io/projected/834711ff-fc4f-4160-b828-c695168e91f0-kube-api-access-4mj7w\") pod \"cert-manager-858654f9db-2vhjd\" (UID: \"834711ff-fc4f-4160-b828-c695168e91f0\") " pod="cert-manager/cert-manager-858654f9db-2vhjd" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.269706 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mj7w\" (UniqueName: \"kubernetes.io/projected/834711ff-fc4f-4160-b828-c695168e91f0-kube-api-access-4mj7w\") pod \"cert-manager-858654f9db-2vhjd\" (UID: \"834711ff-fc4f-4160-b828-c695168e91f0\") " pod="cert-manager/cert-manager-858654f9db-2vhjd" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.273374 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hxkcl\" (UniqueName: \"kubernetes.io/projected/a7ea8051-fa57-4c00-a8f8-2f4f696701d4-kube-api-access-hxkcl\") pod \"cert-manager-webhook-687f57d79b-xvqws\" (UID: \"a7ea8051-fa57-4c00-a8f8-2f4f696701d4\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.275712 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x4d\" (UniqueName: \"kubernetes.io/projected/f225bed8-10a2-4f7c-b1fa-cbd00a97e654-kube-api-access-d2x4d\") pod \"cert-manager-cainjector-cf98fcc89-jm9qg\" (UID: \"f225bed8-10a2-4f7c-b1fa-cbd00a97e654\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.311098 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2vhjd" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.331442 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.346916 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.634330 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xvqws"] Feb 27 17:48:15 crc kubenswrapper[4752]: W0227 17:48:15.643988 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7ea8051_fa57_4c00_a8f8_2f4f696701d4.slice/crio-1228afe481d7e2dcfca67aabf1139118d33261430cc7b36dc68cec5f027361af WatchSource:0}: Error finding container 1228afe481d7e2dcfca67aabf1139118d33261430cc7b36dc68cec5f027361af: Status 404 returned error can't find the container with id 1228afe481d7e2dcfca67aabf1139118d33261430cc7b36dc68cec5f027361af Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.759840 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2vhjd"] Feb 27 17:48:15 crc kubenswrapper[4752]: W0227 17:48:15.762574 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod834711ff_fc4f_4160_b828_c695168e91f0.slice/crio-27342157d315d019a767ceed6c3848213a4b773ae0926d4a43ca2f88dc1791ab WatchSource:0}: Error finding container 27342157d315d019a767ceed6c3848213a4b773ae0926d4a43ca2f88dc1791ab: Status 404 returned error can't find the container with id 27342157d315d019a767ceed6c3848213a4b773ae0926d4a43ca2f88dc1791ab Feb 27 17:48:15 crc kubenswrapper[4752]: I0227 17:48:15.900903 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg"] Feb 27 17:48:15 crc kubenswrapper[4752]: W0227 17:48:15.906682 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf225bed8_10a2_4f7c_b1fa_cbd00a97e654.slice/crio-91d0634f37f7b4c5b70a7be027510eea041b2a2c5a254ef110ab7c69754a7fcb WatchSource:0}: Error finding container 91d0634f37f7b4c5b70a7be027510eea041b2a2c5a254ef110ab7c69754a7fcb: Status 404 returned error can't find the container with id 91d0634f37f7b4c5b70a7be027510eea041b2a2c5a254ef110ab7c69754a7fcb Feb 27 17:48:16 crc kubenswrapper[4752]: I0227 17:48:16.018655 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" event={"ID":"f225bed8-10a2-4f7c-b1fa-cbd00a97e654","Type":"ContainerStarted","Data":"91d0634f37f7b4c5b70a7be027510eea041b2a2c5a254ef110ab7c69754a7fcb"} Feb 27 17:48:16 crc kubenswrapper[4752]: I0227 17:48:16.021272 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2vhjd" event={"ID":"834711ff-fc4f-4160-b828-c695168e91f0","Type":"ContainerStarted","Data":"27342157d315d019a767ceed6c3848213a4b773ae0926d4a43ca2f88dc1791ab"} Feb 27 17:48:16 crc kubenswrapper[4752]: I0227 17:48:16.022865 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" event={"ID":"a7ea8051-fa57-4c00-a8f8-2f4f696701d4","Type":"ContainerStarted","Data":"1228afe481d7e2dcfca67aabf1139118d33261430cc7b36dc68cec5f027361af"} Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.047721 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2vhjd" event={"ID":"834711ff-fc4f-4160-b828-c695168e91f0","Type":"ContainerStarted","Data":"2bcdc2f483b219ad13e0cd8cb6ca5b2800cc593d6db59ec298282283d57e0842"} Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.052372 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" event={"ID":"a7ea8051-fa57-4c00-a8f8-2f4f696701d4","Type":"ContainerStarted","Data":"b610de3dbdca0614d448dccccaf2004bcc939e40aab96689f182556c2a16aa4a"} Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.052692 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.057247 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" event={"ID":"f225bed8-10a2-4f7c-b1fa-cbd00a97e654","Type":"ContainerStarted","Data":"49904f0661897ef41fe7fb898df769f884e1ffb248191bed013d291a52a39ed4"} Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.072919 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-2vhjd" podStartSLOduration=2.784510457 podStartE2EDuration="6.072893172s" podCreationTimestamp="2026-02-27 17:48:14 +0000 UTC" firstStartedPulling="2026-02-27 17:48:15.764505775 +0000 UTC m=+795.671322626" lastFinishedPulling="2026-02-27 17:48:19.05288848 +0000 UTC m=+798.959705341" observedRunningTime="2026-02-27 17:48:20.070022691 +0000 UTC m=+799.976839542" watchObservedRunningTime="2026-02-27 17:48:20.072893172 +0000 UTC m=+799.979710053" Feb 27 17:48:20 crc kubenswrapper[4752]: I0227 17:48:20.096573 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" podStartSLOduration=2.709201818 podStartE2EDuration="6.09655155s" podCreationTimestamp="2026-02-27 17:48:14 +0000 UTC" firstStartedPulling="2026-02-27 17:48:15.649325586 +0000 UTC m=+795.556142437" lastFinishedPulling="2026-02-27 17:48:19.036675318 +0000 UTC m=+798.943492169" observedRunningTime="2026-02-27 17:48:20.0945523 +0000 UTC m=+800.001369151" watchObservedRunningTime="2026-02-27 17:48:20.09655155 +0000 UTC m=+800.003368401" Feb 27 17:48:24 crc kubenswrapper[4752]: E0227 17:48:24.910843 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:24 crc kubenswrapper[4752]: I0227 17:48:24.932526 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-jm9qg" podStartSLOduration=7.819025523 podStartE2EDuration="10.932490665s" podCreationTimestamp="2026-02-27 17:48:14 +0000 UTC" firstStartedPulling="2026-02-27 17:48:15.909628928 +0000 UTC m=+795.816445779" lastFinishedPulling="2026-02-27 17:48:19.02309407 +0000 UTC m=+798.929910921" observedRunningTime="2026-02-27 17:48:20.116259069 +0000 UTC m=+800.023075920" watchObservedRunningTime="2026-02-27 17:48:24.932490665 +0000 UTC m=+804.839307576" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.351328 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-xvqws" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.515024 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sfztq"] Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.516769 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-controller" containerID="cri-o://c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.516814 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="nbdb" containerID="cri-o://709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.516926 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-node" containerID="cri-o://dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.516879 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.516993 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-acl-logging" containerID="cri-o://543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.517090 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="northd" containerID="cri-o://d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.517250 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" 
containerName="sbdb" containerID="cri-o://39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.559964 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" containerID="cri-o://3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21" gracePeriod=30 Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.887758 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/3.log" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.890962 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovn-acl-logging/0.log" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.891591 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovn-controller/0.log" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.892191 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.963661 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8h42x"] Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964050 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964069 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964083 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964092 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964106 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="northd" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964115 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="northd" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964125 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964133 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964171 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-node" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964180 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" 
containerName="kube-rbac-proxy-node" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964190 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kubecfg-setup" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964197 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kubecfg-setup" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964209 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-acl-logging" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964217 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-acl-logging" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964229 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964237 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964252 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964260 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964269 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="sbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964277 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="sbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964287 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="nbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964295 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="nbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964415 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964429 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964438 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964447 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="sbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964458 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="nbdb" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964470 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" 
containerName="northd" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964479 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964492 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-ovn-metrics" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964502 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovn-acl-logging" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964516 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="kube-rbac-proxy-node" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964636 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964645 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: E0227 17:48:25.964657 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964665 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964777 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.964794 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" containerName="ovnkube-controller" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.966952 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994669 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994714 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994754 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994784 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994814 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994831 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994862 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhb87\" (UniqueName: \"kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994854 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994913 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket" (OuterVolumeSpecName: "log-socket") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994917 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994978 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.994881 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995299 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995368 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995838 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995891 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995989 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.995984 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log" (OuterVolumeSpecName: "node-log") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996031 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996292 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996332 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996367 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996438 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash" (OuterVolumeSpecName: "host-slash") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996492 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996492 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996526 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996579 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996620 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996652 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns\") pod \"690b0de6-1f38-4265-bfff-2077a349f89c\" (UID: \"690b0de6-1f38-4265-bfff-2077a349f89c\") " Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996570 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996724 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996741 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.996791 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997211 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997293 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-kubelet\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997334 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-etc-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997374 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-env-overrides\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997522 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-netd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997591 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997687 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-systemd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.997884 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.998135 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-config\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.998310 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-systemd-units\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999259 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-slash\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999340 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999429 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-node-log\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999521 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovn-node-metrics-cert\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999615 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-log-socket\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:25 crc kubenswrapper[4752]: I0227 17:48:25.999789 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-ovn\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:25.999934 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-var-lib-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:25.999993 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-bin\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000026 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-script-lib\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000263 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000307 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vfk\" (UniqueName: \"kubernetes.io/projected/122db0b3-4ecb-48df-8529-ecdc8beaac99-kube-api-access-j4vfk\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 
17:48:26.000356 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-netns\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000445 4752 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000468 4752 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000487 4752 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-node-log\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.000504 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002302 4752 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002335 4752 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-slash\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002354 4752 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002373 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002398 4752 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002416 4752 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/690b0de6-1f38-4265-bfff-2077a349f89c-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002433 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002454 4752 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002474 4752 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002495 4752 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002512 4752 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-log-socket\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002530 4752 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.002547 4752 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.001884 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87" (OuterVolumeSpecName: "kube-api-access-fhb87") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "kube-api-access-fhb87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.012373 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.018555 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "690b0de6-1f38-4265-bfff-2077a349f89c" (UID: "690b0de6-1f38-4265-bfff-2077a349f89c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103585 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-slash\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103670 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103706 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-node-log\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103747 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovn-node-metrics-cert\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103763 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-slash\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103782 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-log-socket\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103845 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-log-socket\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103841 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103880 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-ovn\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103895 4752 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-node-log\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103956 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-var-lib-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104046 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-bin\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104088 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104131 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-script-lib\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.103917 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-ovn\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104045 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-var-lib-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104206 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vfk\" (UniqueName: \"kubernetes.io/projected/122db0b3-4ecb-48df-8529-ecdc8beaac99-kube-api-access-j4vfk\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104264 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-netns\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104312 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-kubelet\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104349 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-etc-openvswitch\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104140 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104397 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-env-overrides\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104094 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-bin\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104437 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-netd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104469 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-systemd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104520 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104571 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-config\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104610 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-systemd-units\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104708 4752 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/690b0de6-1f38-4265-bfff-2077a349f89c-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104737 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhb87\" (UniqueName: \"kubernetes.io/projected/690b0de6-1f38-4265-bfff-2077a349f89c-kube-api-access-fhb87\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104766 4752 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/690b0de6-1f38-4265-bfff-2077a349f89c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104836 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-systemd-units\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104890 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-netns\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104936 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-kubelet\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104978 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-cni-netd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.104990 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-run-systemd\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.105035 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.105058 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/122db0b3-4ecb-48df-8529-ecdc8beaac99-etc-openvswitch\") pod 
\"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.105345 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-script-lib\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.105761 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-env-overrides\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.106239 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovnkube-config\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.110093 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/122db0b3-4ecb-48df-8529-ecdc8beaac99-ovn-node-metrics-cert\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.111051 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/2.log" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.111886 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/1.log" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.111959 4752 generic.go:334] "Generic (PLEG): container finished" podID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" containerID="5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea" exitCode=2 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.112076 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerDied","Data":"5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.112125 4752 scope.go:117] "RemoveContainer" containerID="d6903b8a1bfbbe982436ccb9b24b019f8996d6229249cc62f025c32ecbb56efe" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.112803 4752 scope.go:117] "RemoveContainer" containerID="5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.113243 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qpbx6_openshift-multus(098f70a1-c2c2-44ce-9c0c-356e7eea2da9)\"" pod="openshift-multus/multus-qpbx6" podUID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.117461 4752 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovnkube-controller/3.log" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.120529 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovn-acl-logging/0.log" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.129710 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sfztq_690b0de6-1f38-4265-bfff-2077a349f89c/ovn-controller/0.log" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.130888 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.130933 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.130955 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.130974 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131002 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.130962 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131075 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131111 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131138 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131017 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" exitCode=0 Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131253 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.131209 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134317 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134364 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134384 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134396 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134408 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134419 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134430 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134444 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134454 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134465 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134475 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134491 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} Feb 27 17:48:26 crc 
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134520 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134532 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134544 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134555 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134566 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134577 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134588 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134598 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.134609 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138118 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" exitCode=143
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138243 4752 generic.go:334] "Generic (PLEG): container finished" podID="690b0de6-1f38-4265-bfff-2077a349f89c" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8" exitCode=143
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138296 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vfk\" (UniqueName: \"kubernetes.io/projected/122db0b3-4ecb-48df-8529-ecdc8beaac99-kube-api-access-j4vfk\") pod \"ovnkube-node-8h42x\" (UID: \"122db0b3-4ecb-48df-8529-ecdc8beaac99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8h42x"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138298 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138345 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138372 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138386 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138400 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138412 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138426 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138440 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138455 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138470 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138484 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138512 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sfztq" event={"ID":"690b0de6-1f38-4265-bfff-2077a349f89c","Type":"ContainerDied","Data":"90f139d1a00ad4a16e4501b5ab1653634bf1e35e8f3070d69bd8d242bee3d228"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138540 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138559 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"}
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138574 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"}
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138590 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138605 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138620 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138635 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138649 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138663 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.138678 4752 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.161060 4752 scope.go:117] "RemoveContainer" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.189130 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sfztq"] Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.194269 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.197935 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sfztq"] Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.216663 4752 scope.go:117] "RemoveContainer" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.237351 4752 scope.go:117] "RemoveContainer" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.255934 4752 scope.go:117] "RemoveContainer" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.277029 4752 scope.go:117] "RemoveContainer" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.291717 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.298310 4752 scope.go:117] "RemoveContainer" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.325288 4752 scope.go:117] "RemoveContainer" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.343624 4752 scope.go:117] "RemoveContainer" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.378951 4752 scope.go:117] "RemoveContainer" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.404917 4752 scope.go:117] "RemoveContainer" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.405997 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": container with ID starting with 3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21 not found: ID does not exist" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.406067 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} err="failed to get container status \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": rpc error: code = NotFound desc = could not find container \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": container with ID starting with 3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.406124 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.406745 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": container with ID starting with 6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a not found: ID does not exist" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.406792 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} err="failed to get container status \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": rpc error: code = NotFound desc = could not find container \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": container with ID starting with 6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.406826 4752 scope.go:117] "RemoveContainer" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.407475 4752 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": container with ID starting with 39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca not found: ID does not exist" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.407561 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} err="failed to get container status \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": rpc error: code = NotFound desc = could not find container \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": container with ID starting with 39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.407604 4752 scope.go:117] "RemoveContainer" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.408206 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": container with ID starting with 709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e not found: ID does not exist" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.408249 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} err="failed to get container status \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": rpc error: code = NotFound desc = could not find container \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": container with ID starting with 709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.408274 4752 scope.go:117] "RemoveContainer" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.408707 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": container with ID starting with d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659 not found: ID does not exist" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.408777 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} err="failed to get container status \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": rpc error: code = NotFound desc = could not find container \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": container with ID starting with d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.408828 4752 scope.go:117] "RemoveContainer" 
containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.409264 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": container with ID starting with 62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9 not found: ID does not exist" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.409312 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} err="failed to get container status \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": rpc error: code = NotFound desc = could not find container \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": container with ID starting with 62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.409345 4752 scope.go:117] "RemoveContainer" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.409667 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": container with ID starting with dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b not found: ID does not exist" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.409711 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} err="failed to get container status \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": rpc error: code = NotFound desc = could not find container \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": container with ID starting with dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.409737 4752 scope.go:117] "RemoveContainer" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.410686 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": container with ID starting with 543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9 not found: ID does not exist" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.410751 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} err="failed to get container status \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": rpc error: code = NotFound desc = could not find container \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": container with ID starting with 
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.410796 4752 scope.go:117] "RemoveContainer" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"
Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.411257 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": container with ID starting with c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8 not found: ID does not exist" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.411335 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} err="failed to get container status \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": rpc error: code = NotFound desc = could not find container \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": container with ID starting with c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.411378 4752 scope.go:117] "RemoveContainer" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"
Feb 27 17:48:26 crc kubenswrapper[4752]: E0227 17:48:26.412095 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": container with ID starting with 00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec not found: ID does not exist" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.412191 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} err="failed to get container status \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": rpc error: code = NotFound desc = could not find container \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": container with ID starting with 00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.412221 4752 scope.go:117] "RemoveContainer" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.412723 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} err="failed to get container status \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": rpc error: code = NotFound desc = could not find container \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": container with ID starting with 3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.412760 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413107 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} err="failed to get container status \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": rpc error: code = NotFound desc = could not find container \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": container with ID starting with 6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413135 4752 scope.go:117] "RemoveContainer" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413494 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} err="failed to get container status \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": rpc error: code = NotFound desc = could not find container \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": container with ID starting with 39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413522 4752 scope.go:117] "RemoveContainer" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413852 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} err="failed to get container status \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": rpc error: code = NotFound desc = could not find container \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": container with ID starting with 709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.413898 4752 scope.go:117] "RemoveContainer" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.414675 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} err="failed to get container status \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": rpc error: code = NotFound desc = could not find container \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": container with ID starting with d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.414702 4752 scope.go:117] "RemoveContainer" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.414993 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} err="failed to get container status \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": rpc error: code = NotFound desc = could not find container \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": container with ID starting with 62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.415039 4752 scope.go:117] "RemoveContainer" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.415477 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} err="failed to get container status \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": rpc error: code = NotFound desc = could not find container \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": container with ID starting with dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.415502 4752 scope.go:117] "RemoveContainer" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.415794 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} err="failed to get container status \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": rpc error: code = NotFound desc = could not find container \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": container with ID starting with 543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.415845 4752 scope.go:117] "RemoveContainer" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.416431 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} err="failed to get container status \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": rpc error: code = NotFound desc = could not find container \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": container with ID starting with c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.416468 4752 scope.go:117] "RemoveContainer" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.416888 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} err="failed to get container status \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": rpc error: code = NotFound desc = could not find container \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": container with ID starting with 00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.416933 4752 scope.go:117] "RemoveContainer" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.417248 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} err="failed to get container status \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": rpc error: code = NotFound desc = could not find container \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": container with ID starting with 3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.417287 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.417733 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} err="failed to get container status \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": rpc error: code = NotFound desc = could not find container \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": container with ID starting with 6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.417771 4752 scope.go:117] "RemoveContainer" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418136 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} err="failed to get container status \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": rpc error: code = NotFound desc = could not find container \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": container with ID starting with 39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418204 4752 scope.go:117] "RemoveContainer" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418548 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} err="failed to get container status \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": rpc error: code = NotFound desc = could not find container \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": container with ID starting with 709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418593 4752 scope.go:117] "RemoveContainer" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418935 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} err="failed to get container status \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": rpc error: code = NotFound desc = could not find container \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": container with ID starting with d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.418975 4752 scope.go:117] "RemoveContainer" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.419366 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} err="failed to get container status \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": rpc error: code = NotFound desc = could not find container \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": container with ID starting with 62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.419416 4752 scope.go:117] "RemoveContainer" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.419757 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} err="failed to get container status \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": rpc error: code = NotFound desc = could not find container \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": container with ID starting with dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.419800 4752 scope.go:117] "RemoveContainer" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420121 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} err="failed to get container status \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": rpc error: code = NotFound desc = could not find container \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": container with ID starting with 543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420220 4752 scope.go:117] "RemoveContainer" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420527 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} err="failed to get container status \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": rpc error: code = NotFound desc = could not find container \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": container with ID starting with c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420571 4752 scope.go:117] "RemoveContainer" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420846 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} err="failed to get container status \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": rpc error: code = NotFound desc = could not find container \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": container with ID starting with 00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.420881 4752 scope.go:117] "RemoveContainer" containerID="3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.421184 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21"} err="failed to get container status \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": rpc error: code = NotFound desc = could not find container \"3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21\": container with ID starting with 3a394d383b6a193745c850d33d33919b1f9dd811dea8d55f09a9724091edcf21 not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.421223 4752 scope.go:117] "RemoveContainer" containerID="6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.421806 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a"} err="failed to get container status \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": rpc error: code = NotFound desc = could not find container \"6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a\": container with ID starting with 6730918216e454fd598baff9fabbe3613c94afe09b7c6c3f75f88e2a7fad052a not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.421840 4752 scope.go:117] "RemoveContainer" containerID="39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.422891 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca"} err="failed to get container status \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": rpc error: code = NotFound desc = could not find container \"39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca\": container with ID starting with 39aba08281ec72b0ca80d496c5de2f835023e81319b2b6a84df61c7aba850dca not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.422928 4752 scope.go:117] "RemoveContainer" containerID="709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.423433 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e"} err="failed to get container status \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": rpc error: code = NotFound desc = could not find container \"709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e\": container with ID starting with 709c7d153f050e5c78fbf48a22c71ba5f81b2e03358e753da2c2879365cbbb9e not found: ID does not exist"
Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.423470 4752 scope.go:117] "RemoveContainer" containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"
containerID="d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.423777 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659"} err="failed to get container status \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": rpc error: code = NotFound desc = could not find container \"d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659\": container with ID starting with d66a664b45c3b7abedf74c88a008aa4b56f9387b5d1573c5b9ab935dec40a659 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.423834 4752 scope.go:117] "RemoveContainer" containerID="62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.424246 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9"} err="failed to get container status \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": rpc error: code = NotFound desc = could not find container \"62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9\": container with ID starting with 62bb08ab408cf9fa1b712de52b83a40ab9bac739163f6dbe9762f16035b6b2e9 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.424292 4752 scope.go:117] "RemoveContainer" containerID="dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.426426 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b"} err="failed to get container status \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": rpc error: code = NotFound desc = could not find container \"dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b\": container with ID starting with dfe47f35a683410d58c76461188157485111843e46f700e67aafe4b4b0496a9b not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.426482 4752 scope.go:117] "RemoveContainer" containerID="543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.427315 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9"} err="failed to get container status \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": rpc error: code = NotFound desc = could not find container \"543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9\": container with ID starting with 543c4b0032e57ad25545e974ebdefa4b741a5790455d00cf86ebd9b9101598b9 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.427360 4752 scope.go:117] "RemoveContainer" containerID="c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.427714 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8"} err="failed to get container status \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": rpc error: code = NotFound desc = could not find 
container \"c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8\": container with ID starting with c52f95a3ef097f1e5c9a73d718b91bd4524c03457f2bbdc398f274b6622341c8 not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.428228 4752 scope.go:117] "RemoveContainer" containerID="00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.428642 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec"} err="failed to get container status \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": rpc error: code = NotFound desc = could not find container \"00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec\": container with ID starting with 00681c0f19b44d3aedf4a3cbcb6381d2302b6ba5338bb0fc1fbe39e820aff4ec not found: ID does not exist" Feb 27 17:48:26 crc kubenswrapper[4752]: I0227 17:48:26.919815 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690b0de6-1f38-4265-bfff-2077a349f89c" path="/var/lib/kubelet/pods/690b0de6-1f38-4265-bfff-2077a349f89c/volumes" Feb 27 17:48:27 crc kubenswrapper[4752]: I0227 17:48:27.146120 4752 generic.go:334] "Generic (PLEG): container finished" podID="122db0b3-4ecb-48df-8529-ecdc8beaac99" containerID="0e3b6b84140ecfefb02c7042c8e655db965466e14c5b064dbefe3a070314dc0b" exitCode=0 Feb 27 17:48:27 crc kubenswrapper[4752]: I0227 17:48:27.146241 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerDied","Data":"0e3b6b84140ecfefb02c7042c8e655db965466e14c5b064dbefe3a070314dc0b"} Feb 27 17:48:27 crc kubenswrapper[4752]: I0227 17:48:27.146280 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"76e94b44f059e1b9240cad5c4e9335e3d40e1d4ff63483892d2b4f3de14dfd1e"} Feb 27 17:48:27 crc kubenswrapper[4752]: I0227 17:48:27.154668 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/2.log" Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.164844 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"0e574d6f31cf9da42de32868dfd86be5f501d0d629ee5d936b8ae58fea38c602"} Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.165233 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"412138ce64013172f21892848f2b3366e26c91de92ee3c1a48cd1aee7069574d"} Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.165247 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"dd688b4b31440ffa6562f13fa84a145462f1aa5e0e6f1714306391b55ec2fa09"} Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.165257 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" 
event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"6ada65cefd497d5beff1b33ac696423497cabf6d1cabde1a119481dcb464503b"} Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.165265 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"ae22dab615fd24ba40f81776827d141fa32a066ae2a912f1521fdd2524edf352"} Feb 27 17:48:28 crc kubenswrapper[4752]: I0227 17:48:28.165274 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"83b30d65a210cfada244f6694d7a2a31c1f2bafefe86675fd03444c2f0bdda82"} Feb 27 17:48:31 crc kubenswrapper[4752]: I0227 17:48:31.190366 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"a7e6b9eabd62c16f71bfc3ee87aa7bd89ed850d4c912b521b9c45ee440544385"} Feb 27 17:48:33 crc kubenswrapper[4752]: I0227 17:48:33.218936 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" event={"ID":"122db0b3-4ecb-48df-8529-ecdc8beaac99","Type":"ContainerStarted","Data":"ee61b58a789b41854019779ca2352a33d82f2c71ce13b090a6f54c89063b35d3"} Feb 27 17:48:33 crc kubenswrapper[4752]: I0227 17:48:33.219389 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:33 crc kubenswrapper[4752]: I0227 17:48:33.263235 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" podStartSLOduration=8.263193156 podStartE2EDuration="8.263193156s" podCreationTimestamp="2026-02-27 17:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:48:33.259104535 +0000 UTC m=+813.165921426" watchObservedRunningTime="2026-02-27 17:48:33.263193156 +0000 UTC m=+813.170010027" Feb 27 17:48:33 crc kubenswrapper[4752]: I0227 17:48:33.293452 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:34 crc kubenswrapper[4752]: I0227 17:48:34.225929 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:34 crc kubenswrapper[4752]: I0227 17:48:34.226369 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:34 crc kubenswrapper[4752]: I0227 17:48:34.268503 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:36 crc kubenswrapper[4752]: I0227 17:48:36.323603 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:48:36 crc kubenswrapper[4752]: I0227 17:48:36.324019 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:48:36 crc kubenswrapper[4752]: E0227 17:48:36.814313 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:48:36 crc kubenswrapper[4752]: E0227 17:48:36.814503 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:48:36 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:48:36 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbqcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536908-zvbsb_openshift-infra(4babdb15-835b-4965-9af0-4a697c85f645): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:48:36 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:48:36 crc kubenswrapper[4752]: E0227 17:48:36.815715 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:41 crc kubenswrapper[4752]: I0227 17:48:41.906969 4752 scope.go:117] "RemoveContainer" containerID="5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea" Feb 27 17:48:41 crc kubenswrapper[4752]: E0227 17:48:41.907423 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qpbx6_openshift-multus(098f70a1-c2c2-44ce-9c0c-356e7eea2da9)\"" pod="openshift-multus/multus-qpbx6" podUID="098f70a1-c2c2-44ce-9c0c-356e7eea2da9" Feb 27 17:48:48 crc kubenswrapper[4752]: E0227 17:48:48.912298 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:48:52 crc kubenswrapper[4752]: I0227 17:48:52.906421 4752 scope.go:117] "RemoveContainer" containerID="5c2dfd87b1efc712de9db66e893f49e0c21e3f77daea298231d059ff786e13ea" Feb 27 17:48:53 crc kubenswrapper[4752]: I0227 17:48:53.368867 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qpbx6_098f70a1-c2c2-44ce-9c0c-356e7eea2da9/kube-multus/2.log" Feb 27 17:48:53 crc kubenswrapper[4752]: I0227 17:48:53.369171 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qpbx6" event={"ID":"098f70a1-c2c2-44ce-9c0c-356e7eea2da9","Type":"ContainerStarted","Data":"fd61f8a3882d8af3ee8b1bee2e42421fdfc1d15fe2d52888f3115dcb0737a8b1"} Feb 27 17:48:56 crc kubenswrapper[4752]: I0227 17:48:56.322775 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8h42x" Feb 27 17:48:59 crc kubenswrapper[4752]: E0227 17:48:59.910796 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.037761 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph"] Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.039329 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.041436 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.051447 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph"] Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.107447 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.107608 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.107739 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzt9\" (UniqueName: \"kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9\") pod 
\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.208723 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.208792 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzt9\" (UniqueName: \"kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.208879 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.209503 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.209647 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.245680 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzt9\" (UniqueName: \"kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.353612 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:02 crc kubenswrapper[4752]: I0227 17:49:02.588182 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph"] Feb 27 17:49:03 crc kubenswrapper[4752]: I0227 17:49:03.439729 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerStarted","Data":"2e49d321cef75513cd1147cfdd78d9de382d0af55bf3e60ac7c2883b06c61173"} Feb 27 17:49:03 crc kubenswrapper[4752]: I0227 17:49:03.440236 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerStarted","Data":"25b6ec9db9954592a7c472ac1d432beb3f68a61110c6745aefbec8f9dcc8c624"} Feb 27 17:49:04 crc kubenswrapper[4752]: I0227 17:49:04.447423 4752 generic.go:334] "Generic (PLEG): container finished" podID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerID="2e49d321cef75513cd1147cfdd78d9de382d0af55bf3e60ac7c2883b06c61173" exitCode=0 Feb 27 17:49:04 crc kubenswrapper[4752]: I0227 17:49:04.447479 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerDied","Data":"2e49d321cef75513cd1147cfdd78d9de382d0af55bf3e60ac7c2883b06c61173"} Feb 27 17:49:06 crc kubenswrapper[4752]: I0227 17:49:06.323962 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:49:06 crc kubenswrapper[4752]: I0227 17:49:06.324071 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:49:11 crc kubenswrapper[4752]: E0227 17:49:11.909952 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" podUID="4babdb15-835b-4965-9af0-4a697c85f645" Feb 27 17:49:18 crc kubenswrapper[4752]: I0227 17:49:18.805127 4752 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.636486 4752 generic.go:334] "Generic (PLEG): container finished" podID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerID="44ad69253682718f301bd28be7f088952d601689d1730a622620e9de0d8bf22c" exitCode=0 Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.636563 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" 
event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerDied","Data":"44ad69253682718f301bd28be7f088952d601689d1730a622620e9de0d8bf22c"} Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.908515 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.976290 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b596z"] Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.978112 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:26 crc kubenswrapper[4752]: I0227 17:49:26.984666 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b596z"] Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.045746 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.045815 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlgcq\" (UniqueName: \"kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.045925 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.146648 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.146716 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlgcq\" (UniqueName: \"kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.146783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.147340 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities\") pod \"redhat-operators-b596z\" (UID: 
\"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.147516 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.168017 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlgcq\" (UniqueName: \"kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq\") pod \"redhat-operators-b596z\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.304377 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.644782 4752 generic.go:334] "Generic (PLEG): container finished" podID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerID="bd7ce22c90ebbfcf7068735bd52b16d990ef55e39033a1198cfb956fdb4bdf42" exitCode=0 Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.644846 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerDied","Data":"bd7ce22c90ebbfcf7068735bd52b16d990ef55e39033a1198cfb956fdb4bdf42"} Feb 27 17:49:27 crc kubenswrapper[4752]: I0227 17:49:27.723448 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b596z"] Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.656938 4752 generic.go:334] "Generic (PLEG): container finished" podID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerID="5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0" exitCode=0 Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.657008 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerDied","Data":"5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0"} Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.657475 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerStarted","Data":"a972eb547008e7657893573d47c945afcd545ab572a8dd7424d937f78aa81483"} Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.660176 4752 generic.go:334] "Generic (PLEG): container finished" podID="4babdb15-835b-4965-9af0-4a697c85f645" containerID="056b82ec81a7f51986d7257c0e67aa075fbce8d6a82e958f79db2c3bda2fe32a" exitCode=0 Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.660458 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" event={"ID":"4babdb15-835b-4965-9af0-4a697c85f645","Type":"ContainerDied","Data":"056b82ec81a7f51986d7257c0e67aa075fbce8d6a82e958f79db2c3bda2fe32a"} Feb 27 17:49:28 crc kubenswrapper[4752]: I0227 17:49:28.952993 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.068662 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdzt9\" (UniqueName: \"kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9\") pod \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.068738 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util\") pod \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.068779 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle\") pod \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\" (UID: \"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3\") " Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.069832 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle" (OuterVolumeSpecName: "bundle") pod "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" (UID: "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.079381 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9" (OuterVolumeSpecName: "kube-api-access-wdzt9") pod "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" (UID: "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3"). InnerVolumeSpecName "kube-api-access-wdzt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.084361 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util" (OuterVolumeSpecName: "util") pod "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" (UID: "cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.170524 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdzt9\" (UniqueName: \"kubernetes.io/projected/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-kube-api-access-wdzt9\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.170573 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-util\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.170592 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.673553 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.673696 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph" event={"ID":"cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3","Type":"ContainerDied","Data":"25b6ec9db9954592a7c472ac1d432beb3f68a61110c6745aefbec8f9dcc8c624"} Feb 27 17:49:29 crc kubenswrapper[4752]: I0227 17:49:29.673749 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25b6ec9db9954592a7c472ac1d432beb3f68a61110c6745aefbec8f9dcc8c624" Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.016198 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.083417 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbqcp\" (UniqueName: \"kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp\") pod \"4babdb15-835b-4965-9af0-4a697c85f645\" (UID: \"4babdb15-835b-4965-9af0-4a697c85f645\") " Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.093388 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp" (OuterVolumeSpecName: "kube-api-access-mbqcp") pod "4babdb15-835b-4965-9af0-4a697c85f645" (UID: "4babdb15-835b-4965-9af0-4a697c85f645"). InnerVolumeSpecName "kube-api-access-mbqcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.184288 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbqcp\" (UniqueName: \"kubernetes.io/projected/4babdb15-835b-4965-9af0-4a697c85f645-kube-api-access-mbqcp\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.683101 4752 generic.go:334] "Generic (PLEG): container finished" podID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerID="e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6" exitCode=0 Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.683195 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerDied","Data":"e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6"} Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.688741 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" event={"ID":"4babdb15-835b-4965-9af0-4a697c85f645","Type":"ContainerDied","Data":"ded5fe56ed707bec1cabd17f2d8b407c4e8d22f5642c6b5960a86c1d9a6c5f7d"} Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.688799 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ded5fe56ed707bec1cabd17f2d8b407c4e8d22f5642c6b5960a86c1d9a6c5f7d" Feb 27 17:49:30 crc kubenswrapper[4752]: I0227 17:49:30.688878 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536908-zvbsb" Feb 27 17:49:31 crc kubenswrapper[4752]: I0227 17:49:31.084327 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536902-bmcrh"] Feb 27 17:49:31 crc kubenswrapper[4752]: I0227 17:49:31.091570 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536902-bmcrh"] Feb 27 17:49:31 crc kubenswrapper[4752]: I0227 17:49:31.697767 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerStarted","Data":"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f"} Feb 27 17:49:31 crc kubenswrapper[4752]: I0227 17:49:31.720099 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b596z" podStartSLOduration=3.246113439 podStartE2EDuration="5.720078035s" podCreationTimestamp="2026-02-27 17:49:26 +0000 UTC" firstStartedPulling="2026-02-27 17:49:28.658843919 +0000 UTC m=+868.565660790" lastFinishedPulling="2026-02-27 17:49:31.132808495 +0000 UTC m=+871.039625386" observedRunningTime="2026-02-27 17:49:31.718443954 +0000 UTC m=+871.625260815" watchObservedRunningTime="2026-02-27 17:49:31.720078035 +0000 UTC m=+871.626894886" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852439 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-64src"] Feb 27 17:49:32 crc kubenswrapper[4752]: E0227 17:49:32.852651 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="pull" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852662 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="pull" Feb 27 17:49:32 crc kubenswrapper[4752]: E0227 17:49:32.852675 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4babdb15-835b-4965-9af0-4a697c85f645" containerName="oc" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852681 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="4babdb15-835b-4965-9af0-4a697c85f645" containerName="oc" Feb 27 17:49:32 crc kubenswrapper[4752]: E0227 17:49:32.852689 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="extract" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852695 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="extract" Feb 27 17:49:32 crc kubenswrapper[4752]: E0227 17:49:32.852706 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="util" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852712 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="util" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852800 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3" containerName="extract" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.852811 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="4babdb15-835b-4965-9af0-4a697c85f645" containerName="oc" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.853118 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.854941 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.855064 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zxtdk" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.855066 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.866554 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-64src"] Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.912930 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d475667f-7381-41d5-9e84-e20e48cef57e" path="/var/lib/kubelet/pods/d475667f-7381-41d5-9e84-e20e48cef57e/volumes" Feb 27 17:49:32 crc kubenswrapper[4752]: I0227 17:49:32.927409 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhzb\" (UniqueName: \"kubernetes.io/projected/696dcdcd-c551-48ea-853c-2797a874fdaa-kube-api-access-jdhzb\") pod \"nmstate-operator-75c5dccd6c-64src\" (UID: \"696dcdcd-c551-48ea-853c-2797a874fdaa\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" Feb 27 17:49:33 crc kubenswrapper[4752]: I0227 17:49:33.028651 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhzb\" (UniqueName: \"kubernetes.io/projected/696dcdcd-c551-48ea-853c-2797a874fdaa-kube-api-access-jdhzb\") pod \"nmstate-operator-75c5dccd6c-64src\" (UID: \"696dcdcd-c551-48ea-853c-2797a874fdaa\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" Feb 27 17:49:33 crc kubenswrapper[4752]: I0227 17:49:33.045127 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhzb\" (UniqueName: \"kubernetes.io/projected/696dcdcd-c551-48ea-853c-2797a874fdaa-kube-api-access-jdhzb\") pod \"nmstate-operator-75c5dccd6c-64src\" (UID: \"696dcdcd-c551-48ea-853c-2797a874fdaa\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" Feb 27 17:49:33 crc kubenswrapper[4752]: I0227 17:49:33.166564 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" Feb 27 17:49:33 crc kubenswrapper[4752]: I0227 17:49:33.563272 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-64src"] Feb 27 17:49:33 crc kubenswrapper[4752]: W0227 17:49:33.571745 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod696dcdcd_c551_48ea_853c_2797a874fdaa.slice/crio-4a42cb45601313ff59911200eb0e3ac615546c15094dc37584d6480acb255c5c WatchSource:0}: Error finding container 4a42cb45601313ff59911200eb0e3ac615546c15094dc37584d6480acb255c5c: Status 404 returned error can't find the container with id 4a42cb45601313ff59911200eb0e3ac615546c15094dc37584d6480acb255c5c Feb 27 17:49:33 crc kubenswrapper[4752]: I0227 17:49:33.707625 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" event={"ID":"696dcdcd-c551-48ea-853c-2797a874fdaa","Type":"ContainerStarted","Data":"4a42cb45601313ff59911200eb0e3ac615546c15094dc37584d6480acb255c5c"} Feb 27 17:49:36 crc kubenswrapper[4752]: I0227 17:49:36.323431 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:49:36 crc kubenswrapper[4752]: I0227 17:49:36.323520 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:49:36 crc kubenswrapper[4752]: I0227 17:49:36.323588 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:49:36 crc kubenswrapper[4752]: I0227 17:49:36.324461 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:49:36 crc kubenswrapper[4752]: I0227 17:49:36.324564 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83" gracePeriod=600 Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.305591 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.305685 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.740327 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83" exitCode=0 Feb 27 17:49:37 
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.305591 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b596z"
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.305685 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b596z"
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.740327 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83" exitCode=0
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.740410 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83"}
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.740748 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846"}
Feb 27 17:49:37 crc kubenswrapper[4752]: I0227 17:49:37.740775 4752 scope.go:117] "RemoveContainer" containerID="4e626018c1edbe3730d4f3d103fde91f98edb2e73f244e25466610f806bc6269"
Feb 27 17:49:38 crc kubenswrapper[4752]: I0227 17:49:38.363203 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b596z" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="registry-server" probeResult="failure" output=<
Feb 27 17:49:38 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s
Feb 27 17:49:38 crc kubenswrapper[4752]: >
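
The registry-server's startup probe targets port 50051, and the "timeout: failed to connect service" output format matches grpc_health_probe, which OLM catalog pods commonly use as their probe command (an assumption here; the log itself does not name the probe binary). A manual equivalent from inside the pod would be roughly:

    $ grpc_health_probe -addr=:50051
    # prints "status: SERVING" once the registry is serving the catalog;
    # until then it times out exactly as in the probe output above

The failure is transient: at 17:49:47 below, the startup probe flips to "started" and readiness to "ready" once the catalog has loaded.
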
Need to start a new one" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.354826 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlgcq\" (UniqueName: \"kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq\") pod \"50828f1d-d0b7-4078-84a2-4050abd233d7\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.354911 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities\") pod \"50828f1d-d0b7-4078-84a2-4050abd233d7\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.354984 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content\") pod \"50828f1d-d0b7-4078-84a2-4050abd233d7\" (UID: \"50828f1d-d0b7-4078-84a2-4050abd233d7\") " Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.356944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities" (OuterVolumeSpecName: "utilities") pod "50828f1d-d0b7-4078-84a2-4050abd233d7" (UID: "50828f1d-d0b7-4078-84a2-4050abd233d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.362115 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq" (OuterVolumeSpecName: "kube-api-access-dlgcq") pod "50828f1d-d0b7-4078-84a2-4050abd233d7" (UID: "50828f1d-d0b7-4078-84a2-4050abd233d7"). InnerVolumeSpecName "kube-api-access-dlgcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.456350 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlgcq\" (UniqueName: \"kubernetes.io/projected/50828f1d-d0b7-4078-84a2-4050abd233d7-kube-api-access-dlgcq\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.456381 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.475421 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50828f1d-d0b7-4078-84a2-4050abd233d7" (UID: "50828f1d-d0b7-4078-84a2-4050abd233d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.557712 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50828f1d-d0b7-4078-84a2-4050abd233d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.857111 4752 generic.go:334] "Generic (PLEG): container finished" podID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerID="38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f" exitCode=0 Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.857210 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerDied","Data":"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f"} Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.857283 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b596z" event={"ID":"50828f1d-d0b7-4078-84a2-4050abd233d7","Type":"ContainerDied","Data":"a972eb547008e7657893573d47c945afcd545ab572a8dd7424d937f78aa81483"} Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.857315 4752 scope.go:117] "RemoveContainer" containerID="38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.857229 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b596z" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.887765 4752 scope.go:117] "RemoveContainer" containerID="e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.919211 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b596z"] Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.927406 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b596z"] Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.932786 4752 scope.go:117] "RemoveContainer" containerID="5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.952540 4752 scope.go:117] "RemoveContainer" containerID="38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f" Feb 27 17:49:49 crc kubenswrapper[4752]: E0227 17:49:49.953009 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f\": container with ID starting with 38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f not found: ID does not exist" containerID="38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.953072 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f"} err="failed to get container status \"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f\": rpc error: code = NotFound desc = could not find container \"38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f\": container with ID starting with 38dedc679507df1df92c4d29a4a55db0403f762125da08ff8b0a0dfd04eb607f not found: ID does not exist" Feb 27 17:49:49 crc 
kubenswrapper[4752]: I0227 17:49:49.953113 4752 scope.go:117] "RemoveContainer" containerID="e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6" Feb 27 17:49:49 crc kubenswrapper[4752]: E0227 17:49:49.953782 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6\": container with ID starting with e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6 not found: ID does not exist" containerID="e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.953851 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6"} err="failed to get container status \"e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6\": rpc error: code = NotFound desc = could not find container \"e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6\": container with ID starting with e403c6de74fb514269f4a24845983fa1e9fce30afc6abfc36d121e75bcf0d2f6 not found: ID does not exist" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.953895 4752 scope.go:117] "RemoveContainer" containerID="5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0" Feb 27 17:49:49 crc kubenswrapper[4752]: E0227 17:49:49.954443 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0\": container with ID starting with 5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0 not found: ID does not exist" containerID="5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0" Feb 27 17:49:49 crc kubenswrapper[4752]: I0227 17:49:49.954507 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0"} err="failed to get container status \"5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0\": rpc error: code = NotFound desc = could not find container \"5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0\": container with ID starting with 5efeded3ca5540aabb73b7c993eae910c191e8139c2554a36b18c527b58b75f0 not found: ID does not exist" Feb 27 17:49:50 crc kubenswrapper[4752]: I0227 17:49:50.914892 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" path="/var/lib/kubelet/pods/50828f1d-d0b7-4078-84a2-4050abd233d7/volumes" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.156660 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536910-lld75"] Feb 27 17:50:00 crc kubenswrapper[4752]: E0227 17:50:00.157803 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="extract-utilities" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.157830 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="extract-utilities" Feb 27 17:50:00 crc kubenswrapper[4752]: E0227 17:50:00.157855 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="registry-server" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.157867 4752 
state_mem.go:107] "Deleted CPUSet assignment" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="registry-server" Feb 27 17:50:00 crc kubenswrapper[4752]: E0227 17:50:00.157888 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="extract-content" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.157901 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="extract-content" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.158064 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="50828f1d-d0b7-4078-84a2-4050abd233d7" containerName="registry-server" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.158799 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.161998 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.168074 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.170796 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536910-lld75"] Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.173558 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.197327 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf\") pod \"auto-csr-approver-29536910-lld75\" (UID: \"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4\") " pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.299699 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf\") pod \"auto-csr-approver-29536910-lld75\" (UID: \"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4\") " pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.326049 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf\") pod \"auto-csr-approver-29536910-lld75\" (UID: \"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4\") " pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.489847 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.747655 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536910-lld75"] Feb 27 17:50:00 crc kubenswrapper[4752]: I0227 17:50:00.928050 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536910-lld75" event={"ID":"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4","Type":"ContainerStarted","Data":"24d0d2c589e2519a1b01de74c405f5a51af9855262d2d5e7d11643545b73c331"} Feb 27 17:50:02 crc kubenswrapper[4752]: I0227 17:50:02.943656 4752 generic.go:334] "Generic (PLEG): container finished" podID="3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" containerID="a066bde52a9e7ad432fc41bc146e4b4848a1a43df3324b8c04e110260728c084" exitCode=0 Feb 27 17:50:02 crc kubenswrapper[4752]: I0227 17:50:02.943781 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536910-lld75" event={"ID":"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4","Type":"ContainerDied","Data":"a066bde52a9e7ad432fc41bc146e4b4848a1a43df3324b8c04e110260728c084"} Feb 27 17:50:02 crc kubenswrapper[4752]: I0227 17:50:02.947718 4752 scope.go:117] "RemoveContainer" containerID="b361613190f429b39ca0c0063f059f538ca42e9326823d5e46a5e6ff925985d2" Feb 27 17:50:03 crc kubenswrapper[4752]: I0227 17:50:03.956889 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" event={"ID":"696dcdcd-c551-48ea-853c-2797a874fdaa","Type":"ContainerStarted","Data":"340ace93b3773c196eb3c2bb7f9aebeab519c15098cf87c5428473344c821fa9"} Feb 27 17:50:03 crc kubenswrapper[4752]: I0227 17:50:03.982545 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-64src" podStartSLOduration=1.861983719 podStartE2EDuration="31.982517176s" podCreationTimestamp="2026-02-27 17:49:32 +0000 UTC" firstStartedPulling="2026-02-27 17:49:33.573913067 +0000 UTC m=+873.480729938" lastFinishedPulling="2026-02-27 17:50:03.694446544 +0000 UTC m=+903.601263395" observedRunningTime="2026-02-27 17:50:03.978796413 +0000 UTC m=+903.885613334" watchObservedRunningTime="2026-02-27 17:50:03.982517176 +0000 UTC m=+903.889334067" Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.253069 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.256791 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf\") pod \"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4\" (UID: \"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4\") " Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.267278 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf" (OuterVolumeSpecName: "kube-api-access-zb8pf") pod "3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" (UID: "3e9fc9d7-94c5-4b1f-ab54-13183ac41df4"). InnerVolumeSpecName "kube-api-access-zb8pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.358038 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4-kube-api-access-zb8pf\") on node \"crc\" DevicePath \"\"" Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.964711 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536910-lld75" event={"ID":"3e9fc9d7-94c5-4b1f-ab54-13183ac41df4","Type":"ContainerDied","Data":"24d0d2c589e2519a1b01de74c405f5a51af9855262d2d5e7d11643545b73c331"} Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.965463 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d0d2c589e2519a1b01de74c405f5a51af9855262d2d5e7d11643545b73c331" Feb 27 17:50:04 crc kubenswrapper[4752]: I0227 17:50:04.964752 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536910-lld75" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.009651 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"] Feb 27 17:50:05 crc kubenswrapper[4752]: E0227 17:50:05.010020 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" containerName="oc" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.010048 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" containerName="oc" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.010255 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" containerName="oc" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.010878 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.018906 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.019184 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-l2g98"] Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.019320 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wwwl9" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.020375 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.028617 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"] Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.037781 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-l2g98"] Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.093121 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7wpqr"] Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.108245 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.161350 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.162059 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.168110 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.168232 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-9r5hg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.168424 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169631 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbnm\" (UniqueName: \"kubernetes.io/projected/e3dda850-6a0b-454f-aeda-b11f6c5a7604-kube-api-access-dhbnm\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169663 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-ovs-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169692 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-nmstate-lock\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169708 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9bv\" (UniqueName: \"kubernetes.io/projected/5fc4e66a-972d-4516-96ba-fa4b56a181a0-kube-api-access-xm9bv\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169737 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169757 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-dbus-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169774 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b08a312-7e68-44c9-831e-9cc02cb723c1-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169797 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3dda850-6a0b-454f-aeda-b11f6c5a7604-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169812 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qz5\" (UniqueName: \"kubernetes.io/projected/2b08a312-7e68-44c9-831e-9cc02cb723c1-kube-api-access-t2qz5\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.169833 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28fvz\" (UniqueName: \"kubernetes.io/projected/592dcc36-2b17-4a10-b182-b693490e83c7-kube-api-access-28fvz\") pod \"nmstate-metrics-69594cc75-l2g98\" (UID: \"592dcc36-2b17-4a10-b182-b693490e83c7\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.177915 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270790 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270839 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-dbus-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270861 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b08a312-7e68-44c9-831e-9cc02cb723c1-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270893 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3dda850-6a0b-454f-aeda-b11f6c5a7604-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270912 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qz5\" (UniqueName: \"kubernetes.io/projected/2b08a312-7e68-44c9-831e-9cc02cb723c1-kube-api-access-t2qz5\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270932 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28fvz\" (UniqueName: \"kubernetes.io/projected/592dcc36-2b17-4a10-b182-b693490e83c7-kube-api-access-28fvz\") pod \"nmstate-metrics-69594cc75-l2g98\" (UID: \"592dcc36-2b17-4a10-b182-b693490e83c7\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98"
Feb 27 17:50:05 crc kubenswrapper[4752]: E0227 17:50:05.270948 4752 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Feb 27 17:50:05 crc kubenswrapper[4752]: E0227 17:50:05.271023 4752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert podName:2b08a312-7e68-44c9-831e-9cc02cb723c1 nodeName:}" failed. No retries permitted until 2026-02-27 17:50:05.771004463 +0000 UTC m=+905.677821314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-v5s5z" (UID: "2b08a312-7e68-44c9-831e-9cc02cb723c1") : secret "plugin-serving-cert" not found
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.270965 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbnm\" (UniqueName: \"kubernetes.io/projected/e3dda850-6a0b-454f-aeda-b11f6c5a7604-kube-api-access-dhbnm\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271130 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-ovs-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271189 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-dbus-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271222 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-nmstate-lock\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271252 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-ovs-socket\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271258 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9bv\" (UniqueName: \"kubernetes.io/projected/5fc4e66a-972d-4516-96ba-fa4b56a181a0-kube-api-access-xm9bv\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.271291 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5fc4e66a-972d-4516-96ba-fa4b56a181a0-nmstate-lock\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.272051 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b08a312-7e68-44c9-831e-9cc02cb723c1-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.277196 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3dda850-6a0b-454f-aeda-b11f6c5a7604-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.291744 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9bv\" (UniqueName: \"kubernetes.io/projected/5fc4e66a-972d-4516-96ba-fa4b56a181a0-kube-api-access-xm9bv\") pod \"nmstate-handler-7wpqr\" (UID: \"5fc4e66a-972d-4516-96ba-fa4b56a181a0\") " pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.292603 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qz5\" (UniqueName: \"kubernetes.io/projected/2b08a312-7e68-44c9-831e-9cc02cb723c1-kube-api-access-t2qz5\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.294962 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28fvz\" (UniqueName: \"kubernetes.io/projected/592dcc36-2b17-4a10-b182-b693490e83c7-kube-api-access-28fvz\") pod \"nmstate-metrics-69594cc75-l2g98\" (UID: \"592dcc36-2b17-4a10-b182-b693490e83c7\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.315838 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbnm\" (UniqueName: \"kubernetes.io/projected/e3dda850-6a0b-454f-aeda-b11f6c5a7604-kube-api-access-dhbnm\") pod \"nmstate-webhook-786f45cff4-k6nhg\" (UID: \"e3dda850-6a0b-454f-aeda-b11f6c5a7604\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.322125 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536904-mb7cz"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.335519 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536904-mb7cz"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.349064 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54b8c44687-g4q25"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.349925 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.363959 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b8c44687-g4q25"]
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.367038 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372172 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-oauth-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372228 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhlh\" (UniqueName: \"kubernetes.io/projected/3417b0c6-214e-4d53-8961-aade398d29e1-kube-api-access-jdhlh\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372254 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-oauth-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372278 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-trusted-ca-bundle\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372331 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372406 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-service-ca\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.372490 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-console-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.378740 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.427713 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7wpqr"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-oauth-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473382 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhlh\" (UniqueName: \"kubernetes.io/projected/3417b0c6-214e-4d53-8961-aade398d29e1-kube-api-access-jdhlh\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473402 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-oauth-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473430 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-trusted-ca-bundle\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473456 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473478 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-service-ca\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.473503 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-console-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.474858 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-service-ca\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25"
pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.474937 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-console-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.475610 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-trusted-ca-bundle\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.476366 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3417b0c6-214e-4d53-8961-aade398d29e1-oauth-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.487784 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-oauth-config\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.488055 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3417b0c6-214e-4d53-8961-aade398d29e1-console-serving-cert\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.499985 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhlh\" (UniqueName: \"kubernetes.io/projected/3417b0c6-214e-4d53-8961-aade398d29e1-kube-api-access-jdhlh\") pod \"console-54b8c44687-g4q25\" (UID: \"3417b0c6-214e-4d53-8961-aade398d29e1\") " pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.639254 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-l2g98"] Feb 27 17:50:05 crc kubenswrapper[4752]: W0227 17:50:05.648625 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod592dcc36_2b17_4a10_b182_b693490e83c7.slice/crio-4de408f91e01d31a529771f0460f3ee8379cd78b47ea7bf37fc09f48115e31d1 WatchSource:0}: Error finding container 4de408f91e01d31a529771f0460f3ee8379cd78b47ea7bf37fc09f48115e31d1: Status 404 returned error can't find the container with id 4de408f91e01d31a529771f0460f3ee8379cd78b47ea7bf37fc09f48115e31d1 Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.668539 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.778635 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.787376 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b08a312-7e68-44c9-831e-9cc02cb723c1-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-v5s5z\" (UID: \"2b08a312-7e68-44c9-831e-9cc02cb723c1\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.818140 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg"] Feb 27 17:50:05 crc kubenswrapper[4752]: W0227 17:50:05.831781 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3dda850_6a0b_454f_aeda_b11f6c5a7604.slice/crio-9f7a5062e7bf02304630ce9c4e40516597dbae0fae515cdbfbdcc893bd8a9f61 WatchSource:0}: Error finding container 9f7a5062e7bf02304630ce9c4e40516597dbae0fae515cdbfbdcc893bd8a9f61: Status 404 returned error can't find the container with id 9f7a5062e7bf02304630ce9c4e40516597dbae0fae515cdbfbdcc893bd8a9f61 Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.971961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" event={"ID":"e3dda850-6a0b-454f-aeda-b11f6c5a7604","Type":"ContainerStarted","Data":"9f7a5062e7bf02304630ce9c4e40516597dbae0fae515cdbfbdcc893bd8a9f61"} Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.973094 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" event={"ID":"592dcc36-2b17-4a10-b182-b693490e83c7","Type":"ContainerStarted","Data":"4de408f91e01d31a529771f0460f3ee8379cd78b47ea7bf37fc09f48115e31d1"} Feb 27 17:50:05 crc kubenswrapper[4752]: I0227 17:50:05.974070 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7wpqr" event={"ID":"5fc4e66a-972d-4516-96ba-fa4b56a181a0","Type":"ContainerStarted","Data":"78a42623b913556b81165afe9d82c335d909e0e35a480d1ed5081d2bca0d009b"} Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.088661 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.146419 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b8c44687-g4q25"] Feb 27 17:50:06 crc kubenswrapper[4752]: W0227 17:50:06.149538 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3417b0c6_214e_4d53_8961_aade398d29e1.slice/crio-bd764feb8b2fdbbfd827ab95a9a8fd482e2d490d6fc0467a4a52608bd4786619 WatchSource:0}: Error finding container bd764feb8b2fdbbfd827ab95a9a8fd482e2d490d6fc0467a4a52608bd4786619: Status 404 returned error can't find the container with id bd764feb8b2fdbbfd827ab95a9a8fd482e2d490d6fc0467a4a52608bd4786619 Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.543067 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z"] Feb 27 17:50:06 crc kubenswrapper[4752]: W0227 17:50:06.551369 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b08a312_7e68_44c9_831e_9cc02cb723c1.slice/crio-65e4496c98ade6bf1b3c7fb1d09fc46760ed9c2e6a5cfa5a14b4395f35dd03d2 WatchSource:0}: Error finding container 65e4496c98ade6bf1b3c7fb1d09fc46760ed9c2e6a5cfa5a14b4395f35dd03d2: Status 404 returned error can't find the container with id 65e4496c98ade6bf1b3c7fb1d09fc46760ed9c2e6a5cfa5a14b4395f35dd03d2 Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.920660 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91adcd2-9d66-4213-b9d5-09781e0e3401" path="/var/lib/kubelet/pods/a91adcd2-9d66-4213-b9d5-09781e0e3401/volumes" Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.982044 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" event={"ID":"2b08a312-7e68-44c9-831e-9cc02cb723c1","Type":"ContainerStarted","Data":"65e4496c98ade6bf1b3c7fb1d09fc46760ed9c2e6a5cfa5a14b4395f35dd03d2"} Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.983935 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b8c44687-g4q25" event={"ID":"3417b0c6-214e-4d53-8961-aade398d29e1","Type":"ContainerStarted","Data":"7bd62f21ae870e70acbd898ec75dcc27513c34c686fade160133d7690ce67d15"} Feb 27 17:50:06 crc kubenswrapper[4752]: I0227 17:50:06.983961 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b8c44687-g4q25" event={"ID":"3417b0c6-214e-4d53-8961-aade398d29e1","Type":"ContainerStarted","Data":"bd764feb8b2fdbbfd827ab95a9a8fd482e2d490d6fc0467a4a52608bd4786619"} Feb 27 17:50:07 crc kubenswrapper[4752]: I0227 17:50:07.009155 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54b8c44687-g4q25" podStartSLOduration=2.009126832 podStartE2EDuration="2.009126832s" podCreationTimestamp="2026-02-27 17:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-27 17:50:07.008043405 +0000 UTC m=+906.914860266" watchObservedRunningTime="2026-02-27 17:50:07.009126832 +0000 UTC m=+906.915943683" Feb 27 17:50:15 crc kubenswrapper[4752]: I0227 17:50:15.669412 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54b8c44687-g4q25" Feb 27 17:50:15 crc kubenswrapper[4752]: I0227 
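[Editor's note: the probe entries immediately above and below show the console pod's startup probe reporting unhealthy once, then started, after which the readiness probe flips to ready. This is the standard gating rule: until the startup probe has succeeded, readiness (and liveness) results are not acted on. A minimal sketch of that rule; the names are illustrative, not kubelet's own:]

    package main

    import "fmt"

    type podProbes struct {
    	startupDone bool
    }

    // observe applies one probe result and reports whether the pod is ready,
    // ignoring readiness results while the startup probe is still pending.
    func (p *podProbes) observe(probe string, healthy bool) (ready bool) {
    	switch probe {
    	case "startup":
    		if healthy {
    			p.startupDone = true // "status=started" in the log
    		}
    	case "readiness":
    		if p.startupDone && healthy {
    			return true // "status=ready"
    		}
    	}
    	return false
    }

    func main() {
    	var p podProbes
    	fmt.Println(p.observe("readiness", true)) // false: startup not done yet
    	fmt.Println(p.observe("startup", false))  // false: "unhealthy"
    	fmt.Println(p.observe("startup", true))   // false, but unblocks readiness
    	fmt.Println(p.observe("readiness", true)) // true: pod becomes ready
    }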
Feb 27 17:50:15 crc kubenswrapper[4752]: I0227 17:50:15.670007 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:15 crc kubenswrapper[4752]: I0227 17:50:15.676876 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:16 crc kubenswrapper[4752]: I0227 17:50:16.052581 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54b8c44687-g4q25"
Feb 27 17:50:16 crc kubenswrapper[4752]: I0227 17:50:16.139260 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zj6td"]
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.195068 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zj6td" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerName="console" containerID="cri-o://64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e" gracePeriod=15
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.635881 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zj6td_2d1dabd3-4307-468d-86d9-01a1ac2e3539/console/0.log"
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.636312 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zj6td"
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786616 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786728 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786777 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786876 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2g4j\" (UniqueName: \"kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786912 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.786949 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.787018 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config\") pod \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\" (UID: \"2d1dabd3-4307-468d-86d9-01a1ac2e3539\") "
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.787876 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config" (OuterVolumeSpecName: "console-config") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.787972 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.788036 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca" (OuterVolumeSpecName: "service-ca") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.788535 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.794041 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.795095 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.795951 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j" (OuterVolumeSpecName: "kube-api-access-f2g4j") pod "2d1dabd3-4307-468d-86d9-01a1ac2e3539" (UID: "2d1dabd3-4307-468d-86d9-01a1ac2e3539"). InnerVolumeSpecName "kube-api-access-f2g4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889060 4752 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889100 4752 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889128 4752 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889137 4752 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-console-config\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889181 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2g4j\" (UniqueName: \"kubernetes.io/projected/2d1dabd3-4307-468d-86d9-01a1ac2e3539-kube-api-access-f2g4j\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889193 4752 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-service-ca\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:41 crc kubenswrapper[4752]: I0227 17:50:41.889201 4752 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d1dabd3-4307-468d-86d9-01a1ac2e3539-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.245488 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zj6td_2d1dabd3-4307-468d-86d9-01a1ac2e3539/console/0.log"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.245968 4752 generic.go:334] "Generic (PLEG): container finished" podID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerID="64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e" exitCode=2
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.246014 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zj6td" event={"ID":"2d1dabd3-4307-468d-86d9-01a1ac2e3539","Type":"ContainerDied","Data":"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"}
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.246062 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zj6td" event={"ID":"2d1dabd3-4307-468d-86d9-01a1ac2e3539","Type":"ContainerDied","Data":"43f8e47388658cac77045d5b07b88f1a9a72db63db112ac5d99816c4ead09662"}
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.246073 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zj6td"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.246090 4752 scope.go:117] "RemoveContainer" containerID="64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.278547 4752 scope.go:117] "RemoveContainer" containerID="64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"
Feb 27 17:50:42 crc kubenswrapper[4752]: E0227 17:50:42.279241 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e\": container with ID starting with 64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e not found: ID does not exist" containerID="64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.279379 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e"} err="failed to get container status \"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e\": rpc error: code = NotFound desc = could not find container \"64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e\": container with ID starting with 64a64a525e4c1c18f559a7fddd6b76851042982881904e412a58e19cb5a1329e not found: ID does not exist"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.294916 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zj6td"]
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.299372 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zj6td"]
Feb 27 17:50:42 crc kubenswrapper[4752]: E0227 17:50:42.337893 4752 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1dabd3_4307_468d_86d9_01a1ac2e3539.slice/crio-43f8e47388658cac77045d5b07b88f1a9a72db63db112ac5d99816c4ead09662\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1dabd3_4307_468d_86d9_01a1ac2e3539.slice\": RecentStats: unable to find data in memory cache]"
Feb 27 17:50:42 crc kubenswrapper[4752]: I0227 17:50:42.918936 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" path="/var/lib/kubelet/pods/2d1dabd3-4307-468d-86d9-01a1ac2e3539/volumes"
Feb 27 17:50:45 crc kubenswrapper[4752]: I0227 17:50:45.279184 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" event={"ID":"2b08a312-7e68-44c9-831e-9cc02cb723c1","Type":"ContainerStarted","Data":"666c37475fe083f8814a400397dab5bb421e81d5e41396a093131bef1311b389"}
Feb 27 17:50:45 crc kubenswrapper[4752]: I0227 17:50:45.307095 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-v5s5z" podStartSLOduration=2.6749647149999998 podStartE2EDuration="40.307065844s" podCreationTimestamp="2026-02-27 17:50:05 +0000 UTC" firstStartedPulling="2026-02-27 17:50:06.553539501 +0000 UTC m=+906.460356382" lastFinishedPulling="2026-02-27 17:50:44.18564066 +0000 UTC m=+944.092457511" observedRunningTime="2026-02-27 17:50:45.301822414 +0000 UTC m=+945.208639305" watchObservedRunningTime="2026-02-27 17:50:45.307065844 +0000 UTC m=+945.213882725"
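[Editor's note: the startup-latency entry above records two durations for the nmstate-console-plugin pod: podStartE2EDuration (creation to observed running, ~40.31s) and podStartSLOduration, which excludes time spent pulling images. Working from the monotonic offsets (the m=+... suffixes): pulling took 944.092457511 - 906.460356382 = 37.632101129s, and 40.307065844 - 37.632101129 = 2.674964715s, which matches the logged value. A short Go check of that arithmetic, using the numbers from the entry:]

    package main

    import "fmt"

    func main() {
    	// Monotonic clock offsets (the "m=+..." suffixes) from the entry above.
    	firstStartedPulling := 906.460356382 // 17:50:06, image pull begins
    	lastFinishedPulling := 944.092457511 // 17:50:44, image pull ends
    	e2e := 40.307065844                  // podStartE2EDuration in seconds

    	pulling := lastFinishedPulling - firstStartedPulling
    	slo := e2e - pulling

    	fmt.Printf("image pulling:       %.9fs\n", pulling) // 37.632101129s
    	fmt.Printf("podStartSLOduration: %.9fs\n", slo)     // 2.674964715s, as logged
    }

[So the SLO metric attributes only ~2.7s of startup to the kubelet itself; the other ~37.6s was registry pull time for the plugin image.]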
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.217900 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.218744 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-handler,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:COMPONENT,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/component'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PART_OF,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/part-of'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/version'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:MANAGED_BY,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/managed-by'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},EnvVar{Name:NMSTATE_INSTANCE_NODE_LOCK_FILE,Value:/var/k8s_nmstate/handler_lock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:dbus-socket,ReadOnly:false,MountPath:/run/dbus/system_bus_socket,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nmstate-lock,ReadOnly:false,MountPath:/var/k8s_nmstate,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovs-socket,ReadOnly:false,MountPath:/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm9bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[cat /tmp/healthy],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-handler-7wpqr_openshift-nmstate(5fc4e66a-972d-4516-96ba-fa4b56a181a0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.220141 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0"
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.316068 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.316349 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-metrics,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_METRICS_MANAGER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 17:51:07 crc kubenswrapper[4752]: E0227 17:51:07.430356 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0"
Feb 27 17:51:08 crc kubenswrapper[4752]: E0227 17:51:08.981805 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:08 crc kubenswrapper[4752]: E0227 17:51:08.982503 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-webhook,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_WEBHOOK_SERVER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-key-pair,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhbnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{1 0 webhook-server},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{HTTPHeader{Name:Content-Type,Value:application/json,},},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-webhook-786f45cff4-k6nhg_openshift-nmstate(e3dda850-6a0b-454f-aeda-b11f6c5a7604): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 17:51:08 crc kubenswrapper[4752]: E0227 17:51:08.983784 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-webhook\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" podUID="e3dda850-6a0b-454f-aeda-b11f6c5a7604"
Feb 27 17:51:09 crc kubenswrapper[4752]: E0227 17:51:09.445743 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" podUID="e3dda850-6a0b-454f-aeda-b11f6c5a7604"
Feb 27 17:51:20 crc kubenswrapper[4752]: E0227 17:51:20.119509 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:20 crc kubenswrapper[4752]: E0227 17:51:20.120543 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-handler,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:COMPONENT,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/component'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PART_OF,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/part-of'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/version'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:MANAGED_BY,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/managed-by'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},EnvVar{Name:NMSTATE_INSTANCE_NODE_LOCK_FILE,Value:/var/k8s_nmstate/handler_lock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:dbus-socket,ReadOnly:false,MountPath:/run/dbus/system_bus_socket,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nmstate-lock,ReadOnly:false,MountPath:/var/k8s_nmstate,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovs-socket,ReadOnly:false,MountPath:/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm9bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[cat /tmp/healthy],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-handler-7wpqr_openshift-nmstate(5fc4e66a-972d-4516-96ba-fa4b56a181a0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 17:51:20 crc kubenswrapper[4752]: E0227 17:51:20.122130 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0"
Feb 27 17:51:21 crc kubenswrapper[4752]: E0227 17:51:21.079655 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:21 crc kubenswrapper[4752]: E0227 17:51:21.080035 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-webhook,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_WEBHOOK_SERVER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-key-pair,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhbnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{1 0 webhook-server},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{HTTPHeader{Name:Content-Type,Value:application/json,},},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-webhook-786f45cff4-k6nhg_openshift-nmstate(e3dda850-6a0b-454f-aeda-b11f6c5a7604): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 17:51:21 crc kubenswrapper[4752]: E0227 17:51:21.081196 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-webhook\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" podUID="e3dda850-6a0b-454f-aeda-b11f6c5a7604"
Feb 27 17:51:34 crc kubenswrapper[4752]: E0227 17:51:34.910242 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0"
Feb 27 17:51:35 crc kubenswrapper[4752]: E0227 17:51:35.909054 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" podUID="e3dda850-6a0b-454f-aeda-b11f6c5a7604"
Feb 27 17:51:47 crc kubenswrapper[4752]: E0227 17:51:47.178230 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5"
Feb 27 17:51:47 crc kubenswrapper[4752]: E0227 17:51:47.178980 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-handler,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:COMPONENT,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/component'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PART_OF,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/part-of'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/version'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:MANAGED_BY,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/managed-by'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},EnvVar{Name:NMSTATE_INSTANCE_NODE_LOCK_FILE,Value:/var/k8s_nmstate/handler_lock,ValueFrom:nil,},},Res
ources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:dbus-socket,ReadOnly:false,MountPath:/run/dbus/system_bus_socket,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nmstate-lock,ReadOnly:false,MountPath:/var/k8s_nmstate,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovs-socket,ReadOnly:false,MountPath:/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm9bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[cat /tmp/healthy],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-handler-7wpqr_openshift-nmstate(5fc4e66a-972d-4516-96ba-fa4b56a181a0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:51:47 crc kubenswrapper[4752]: E0227 17:51:47.180421 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.730000 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" event={"ID":"e3dda850-6a0b-454f-aeda-b11f6c5a7604","Type":"ContainerStarted","Data":"9966c04f00cf4b4373725876007f6916913f4246ee0f60b27a82019833744a17"} Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.730852 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.750646 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" podStartSLOduration=2.219409078 podStartE2EDuration="1m45.750620375s" podCreationTimestamp="2026-02-27 17:50:04 +0000 UTC" firstStartedPulling="2026-02-27 17:50:05.834924132 +0000 UTC 
m=+905.741740983" lastFinishedPulling="2026-02-27 17:51:49.366135419 +0000 UTC m=+1009.272952280" observedRunningTime="2026-02-27 17:51:49.746830811 +0000 UTC m=+1009.653647662" watchObservedRunningTime="2026-02-27 17:51:49.750620375 +0000 UTC m=+1009.657437266" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.875866 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:51:49 crc kubenswrapper[4752]: E0227 17:51:49.876135 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerName="console" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.876170 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerName="console" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.876297 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1dabd3-4307-468d-86d9-01a1ac2e3539" containerName="console" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.877146 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.898344 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.914914 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.914990 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:49 crc kubenswrapper[4752]: I0227 17:51:49.915019 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfgt5\" (UniqueName: \"kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.015783 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfgt5\" (UniqueName: \"kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.015897 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.015940 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.016452 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.017070 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.040061 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfgt5\" (UniqueName: \"kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5\") pod \"certified-operators-w589x\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.206763 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.491378 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.736757 4752 generic.go:334] "Generic (PLEG): container finished" podID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerID="c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51" exitCode=0 Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.736795 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerDied","Data":"c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51"} Feb 27 17:51:50 crc kubenswrapper[4752]: I0227 17:51:50.736832 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerStarted","Data":"d51c5fd70b5622abed844d97f4227ac2828f3a893c630a96d04135d01066f4bb"} Feb 27 17:51:51 crc kubenswrapper[4752]: I0227 17:51:51.748217 4752 generic.go:334] "Generic (PLEG): container finished" podID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerID="b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce" exitCode=0 Feb 27 17:51:51 crc kubenswrapper[4752]: I0227 17:51:51.748392 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerDied","Data":"b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce"} Feb 27 17:51:52 crc kubenswrapper[4752]: I0227 17:51:52.760248 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" 
event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerStarted","Data":"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b"} Feb 27 17:51:52 crc kubenswrapper[4752]: I0227 17:51:52.792212 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w589x" podStartSLOduration=2.382153646 podStartE2EDuration="3.792186297s" podCreationTimestamp="2026-02-27 17:51:49 +0000 UTC" firstStartedPulling="2026-02-27 17:51:50.738539755 +0000 UTC m=+1010.645356606" lastFinishedPulling="2026-02-27 17:51:52.148572356 +0000 UTC m=+1012.055389257" observedRunningTime="2026-02-27 17:51:52.791753166 +0000 UTC m=+1012.698570027" watchObservedRunningTime="2026-02-27 17:51:52.792186297 +0000 UTC m=+1012.699003188" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.441659 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.443950 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.457384 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.530946 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.531229 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vpq\" (UniqueName: \"kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.531409 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.633416 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.633552 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vpq\" (UniqueName: \"kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.633660 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.634613 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.634698 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.662909 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vpq\" (UniqueName: \"kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq\") pod \"community-operators-tvqkn\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: I0227 17:51:58.767319 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:51:58 crc kubenswrapper[4752]: E0227 17:51:58.912448 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:51:59 crc kubenswrapper[4752]: I0227 17:51:59.237828 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:51:59 crc kubenswrapper[4752]: W0227 17:51:59.243659 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacea6aaf_6bcb_4b61_9297_90d2784b2505.slice/crio-01132c98115aab456f460799b30d70eb612a0c1409091e9f3ad10b84c970830f WatchSource:0}: Error finding container 01132c98115aab456f460799b30d70eb612a0c1409091e9f3ad10b84c970830f: Status 404 returned error can't find the container with id 01132c98115aab456f460799b30d70eb612a0c1409091e9f3ad10b84c970830f Feb 27 17:51:59 crc kubenswrapper[4752]: I0227 17:51:59.806591 4752 generic.go:334] "Generic (PLEG): container finished" podID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerID="58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4" exitCode=0 Feb 27 17:51:59 crc kubenswrapper[4752]: I0227 17:51:59.806650 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerDied","Data":"58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4"} Feb 27 17:51:59 crc kubenswrapper[4752]: I0227 17:51:59.806687 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" 
event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerStarted","Data":"01132c98115aab456f460799b30d70eb612a0c1409091e9f3ad10b84c970830f"} Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.143511 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536912-zr29z"] Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.145220 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.148759 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.148780 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.150671 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.158911 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536912-zr29z"] Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.207715 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.207957 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.254610 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.268420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hpx2\" (UniqueName: \"kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2\") pod \"auto-csr-approver-29536912-zr29z\" (UID: \"b753d203-d3cb-4304-b753-4ec10344426b\") " pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.370104 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hpx2\" (UniqueName: \"kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2\") pod \"auto-csr-approver-29536912-zr29z\" (UID: \"b753d203-d3cb-4304-b753-4ec10344426b\") " pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.402604 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hpx2\" (UniqueName: \"kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2\") pod \"auto-csr-approver-29536912-zr29z\" (UID: \"b753d203-d3cb-4304-b753-4ec10344426b\") " pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.466474 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.699532 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536912-zr29z"] Feb 27 17:52:00 crc kubenswrapper[4752]: E0227 17:52:00.775374 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 17:52:00 crc kubenswrapper[4752]: E0227 17:52:00.775534 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72vpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tvqkn_openshift-marketplace(acea6aaf-6bcb-4b61-9297-90d2784b2505): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:52:00 crc kubenswrapper[4752]: E0227 17:52:00.776654 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-tvqkn" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.822466 4752 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29536912-zr29z" event={"ID":"b753d203-d3cb-4304-b753-4ec10344426b","Type":"ContainerStarted","Data":"ad8abd137022917680a8e623fdd0571d5f54377f5dcac67f59da72f63834351a"} Feb 27 17:52:00 crc kubenswrapper[4752]: E0227 17:52:00.824258 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tvqkn" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" Feb 27 17:52:00 crc kubenswrapper[4752]: I0227 17:52:00.885984 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:02 crc kubenswrapper[4752]: I0227 17:52:02.624500 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:52:02 crc kubenswrapper[4752]: I0227 17:52:02.845293 4752 generic.go:334] "Generic (PLEG): container finished" podID="b753d203-d3cb-4304-b753-4ec10344426b" containerID="3ddad92e6bc219d176dca73e2ebedc05464e9d25492781347bddf6b670a59fcf" exitCode=0 Feb 27 17:52:02 crc kubenswrapper[4752]: I0227 17:52:02.845595 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536912-zr29z" event={"ID":"b753d203-d3cb-4304-b753-4ec10344426b","Type":"ContainerDied","Data":"3ddad92e6bc219d176dca73e2ebedc05464e9d25492781347bddf6b670a59fcf"} Feb 27 17:52:03 crc kubenswrapper[4752]: I0227 17:52:03.068942 4752 scope.go:117] "RemoveContainer" containerID="2fc07b2dafe4b0a53be5ef4318b5c67a85c4e8d2653bc57970bcfb4d0a6f3496" Feb 27 17:52:03 crc kubenswrapper[4752]: I0227 17:52:03.853016 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w589x" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="registry-server" containerID="cri-o://680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b" gracePeriod=2 Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.135897 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.226623 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hpx2\" (UniqueName: \"kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2\") pod \"b753d203-d3cb-4304-b753-4ec10344426b\" (UID: \"b753d203-d3cb-4304-b753-4ec10344426b\") " Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.231273 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2" (OuterVolumeSpecName: "kube-api-access-9hpx2") pod "b753d203-d3cb-4304-b753-4ec10344426b" (UID: "b753d203-d3cb-4304-b753-4ec10344426b"). InnerVolumeSpecName "kube-api-access-9hpx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.269420 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.327560 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities\") pod \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.327656 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content\") pod \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.327680 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfgt5\" (UniqueName: \"kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5\") pod \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\" (UID: \"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f\") " Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.328343 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hpx2\" (UniqueName: \"kubernetes.io/projected/b753d203-d3cb-4304-b753-4ec10344426b-kube-api-access-9hpx2\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.328734 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities" (OuterVolumeSpecName: "utilities") pod "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" (UID: "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.331480 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5" (OuterVolumeSpecName: "kube-api-access-nfgt5") pod "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" (UID: "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f"). InnerVolumeSpecName "kube-api-access-nfgt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.376508 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" (UID: "6bda19b6-f483-43a8-9d7e-21a8ce77fe9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.429388 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.429422 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.429433 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfgt5\" (UniqueName: \"kubernetes.io/projected/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f-kube-api-access-nfgt5\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.865032 4752 generic.go:334] "Generic (PLEG): container finished" podID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerID="680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b" exitCode=0 Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.865175 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerDied","Data":"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b"} Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.865239 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w589x" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.865262 4752 scope.go:117] "RemoveContainer" containerID="680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.865243 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w589x" event={"ID":"6bda19b6-f483-43a8-9d7e-21a8ce77fe9f","Type":"ContainerDied","Data":"d51c5fd70b5622abed844d97f4227ac2828f3a893c630a96d04135d01066f4bb"} Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.868042 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536912-zr29z" event={"ID":"b753d203-d3cb-4304-b753-4ec10344426b","Type":"ContainerDied","Data":"ad8abd137022917680a8e623fdd0571d5f54377f5dcac67f59da72f63834351a"} Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.868081 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad8abd137022917680a8e623fdd0571d5f54377f5dcac67f59da72f63834351a" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.868183 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536912-zr29z" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.895394 4752 scope.go:117] "RemoveContainer" containerID="b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.921900 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.928048 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w589x"] Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.940662 4752 scope.go:117] "RemoveContainer" containerID="c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.963971 4752 scope.go:117] "RemoveContainer" containerID="680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b" Feb 27 17:52:04 crc kubenswrapper[4752]: E0227 17:52:04.964522 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b\": container with ID starting with 680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b not found: ID does not exist" containerID="680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.964576 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b"} err="failed to get container status \"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b\": rpc error: code = NotFound desc = could not find container \"680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b\": container with ID starting with 680ea40251152734c0d031b70c90fc7dfcc639a2f3597fa3319a81537096871b not found: ID does not exist" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.964614 4752 scope.go:117] "RemoveContainer" containerID="b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce" Feb 27 17:52:04 crc kubenswrapper[4752]: E0227 17:52:04.965070 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce\": container with ID starting with b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce not found: ID does not exist" containerID="b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.965198 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce"} err="failed to get container status \"b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce\": rpc error: code = NotFound desc = could not find container \"b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce\": container with ID starting with b9b0fc1e466db4b9af1d725a959e7ea73833ee72a3d2996f97b14805e10d99ce not found: ID does not exist" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.965227 4752 scope.go:117] "RemoveContainer" containerID="c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51" Feb 27 17:52:04 crc kubenswrapper[4752]: E0227 17:52:04.965579 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51\": container with ID starting with c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51 not found: ID does not exist" containerID="c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51" Feb 27 17:52:04 crc kubenswrapper[4752]: I0227 17:52:04.965625 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51"} err="failed to get container status \"c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51\": rpc error: code = NotFound desc = could not find container \"c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51\": container with ID starting with c6805241c6b1dba2569632934b5450cbe44c31f4e0af864f9132021dbfe27c51 not found: ID does not exist" Feb 27 17:52:05 crc kubenswrapper[4752]: I0227 17:52:05.209109 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536906-xdhv9"] Feb 27 17:52:05 crc kubenswrapper[4752]: I0227 17:52:05.212766 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536906-xdhv9"] Feb 27 17:52:05 crc kubenswrapper[4752]: I0227 17:52:05.374055 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k6nhg" Feb 27 17:52:06 crc kubenswrapper[4752]: I0227 17:52:06.324646 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:52:06 crc kubenswrapper[4752]: I0227 17:52:06.324741 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:52:06 crc kubenswrapper[4752]: I0227 17:52:06.922731 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" path="/var/lib/kubelet/pods/6bda19b6-f483-43a8-9d7e-21a8ce77fe9f/volumes" Feb 27 17:52:06 crc kubenswrapper[4752]: I0227 17:52:06.924485 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b0bdc4-d072-4dff-ab6a-c4b02431d46c" path="/var/lib/kubelet/pods/90b0bdc4-d072-4dff-ab6a-c4b02431d46c/volumes" Feb 27 17:52:09 crc kubenswrapper[4752]: E0227 17:52:09.123405 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902" Feb 27 17:52:09 crc kubenswrapper[4752]: E0227 17:52:09.124344 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,Command:[],Args:[--logtostderr --secure-listen-address=:8443 --upstream=http://127.0.0.1:8089],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:52:09 crc kubenswrapper[4752]: E0227 17:52:09.125706 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nmstate-metrics\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)\"]" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:52:13 crc kubenswrapper[4752]: E0227 17:52:13.912257 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:52:14 crc kubenswrapper[4752]: I0227 17:52:14.948403 4752 generic.go:334] "Generic (PLEG): container finished" podID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerID="598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680" exitCode=0 Feb 
27 17:52:14 crc kubenswrapper[4752]: I0227 17:52:14.948513 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerDied","Data":"598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680"} Feb 27 17:52:15 crc kubenswrapper[4752]: I0227 17:52:15.962140 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerStarted","Data":"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52"} Feb 27 17:52:15 crc kubenswrapper[4752]: I0227 17:52:15.993180 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tvqkn" podStartSLOduration=2.203329114 podStartE2EDuration="17.993131972s" podCreationTimestamp="2026-02-27 17:51:58 +0000 UTC" firstStartedPulling="2026-02-27 17:51:59.808778567 +0000 UTC m=+1019.715595458" lastFinishedPulling="2026-02-27 17:52:15.598581425 +0000 UTC m=+1035.505398316" observedRunningTime="2026-02-27 17:52:15.98982762 +0000 UTC m=+1035.896644511" watchObservedRunningTime="2026-02-27 17:52:15.993131972 +0000 UTC m=+1035.899948853" Feb 27 17:52:18 crc kubenswrapper[4752]: I0227 17:52:18.768597 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:18 crc kubenswrapper[4752]: I0227 17:52:18.769417 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:18 crc kubenswrapper[4752]: I0227 17:52:18.843466 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:28 crc kubenswrapper[4752]: I0227 17:52:28.850763 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:28 crc kubenswrapper[4752]: I0227 17:52:28.939771 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.051941 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tvqkn" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="registry-server" containerID="cri-o://c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52" gracePeriod=2 Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.539211 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.586082 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72vpq\" (UniqueName: \"kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq\") pod \"acea6aaf-6bcb-4b61-9297-90d2784b2505\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.586469 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities\") pod \"acea6aaf-6bcb-4b61-9297-90d2784b2505\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.587442 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities" (OuterVolumeSpecName: "utilities") pod "acea6aaf-6bcb-4b61-9297-90d2784b2505" (UID: "acea6aaf-6bcb-4b61-9297-90d2784b2505"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.599944 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq" (OuterVolumeSpecName: "kube-api-access-72vpq") pod "acea6aaf-6bcb-4b61-9297-90d2784b2505" (UID: "acea6aaf-6bcb-4b61-9297-90d2784b2505"). InnerVolumeSpecName "kube-api-access-72vpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.687613 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content\") pod \"acea6aaf-6bcb-4b61-9297-90d2784b2505\" (UID: \"acea6aaf-6bcb-4b61-9297-90d2784b2505\") " Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.688086 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72vpq\" (UniqueName: \"kubernetes.io/projected/acea6aaf-6bcb-4b61-9297-90d2784b2505-kube-api-access-72vpq\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.688129 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.765091 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acea6aaf-6bcb-4b61-9297-90d2784b2505" (UID: "acea6aaf-6bcb-4b61-9297-90d2784b2505"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:52:29 crc kubenswrapper[4752]: I0227 17:52:29.789335 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acea6aaf-6bcb-4b61-9297-90d2784b2505-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.066610 4752 generic.go:334] "Generic (PLEG): container finished" podID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerID="c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52" exitCode=0 Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.066661 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerDied","Data":"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52"} Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.066705 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvqkn" event={"ID":"acea6aaf-6bcb-4b61-9297-90d2784b2505","Type":"ContainerDied","Data":"01132c98115aab456f460799b30d70eb612a0c1409091e9f3ad10b84c970830f"} Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.066731 4752 scope.go:117] "RemoveContainer" containerID="c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.066744 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvqkn" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.090990 4752 scope.go:117] "RemoveContainer" containerID="598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.122903 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.127766 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tvqkn"] Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.133984 4752 scope.go:117] "RemoveContainer" containerID="58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.154465 4752 scope.go:117] "RemoveContainer" containerID="c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52" Feb 27 17:52:30 crc kubenswrapper[4752]: E0227 17:52:30.154794 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52\": container with ID starting with c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52 not found: ID does not exist" containerID="c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.154823 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52"} err="failed to get container status \"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52\": rpc error: code = NotFound desc = could not find container \"c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52\": container with ID starting with c470e6f44294089bbb631c430d1cc0e299d5b9dae1c99c758f0df63b02dc3d52 not found: ID does not exist" Feb 27 
17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.154843 4752 scope.go:117] "RemoveContainer" containerID="598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680" Feb 27 17:52:30 crc kubenswrapper[4752]: E0227 17:52:30.155138 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680\": container with ID starting with 598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680 not found: ID does not exist" containerID="598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.155169 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680"} err="failed to get container status \"598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680\": rpc error: code = NotFound desc = could not find container \"598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680\": container with ID starting with 598c0fa28cdc93e5d142497dbf1473bdab1d81c00d55c2654fd23df36c490680 not found: ID does not exist" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.155184 4752 scope.go:117] "RemoveContainer" containerID="58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4" Feb 27 17:52:30 crc kubenswrapper[4752]: E0227 17:52:30.155573 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4\": container with ID starting with 58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4 not found: ID does not exist" containerID="58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.155593 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4"} err="failed to get container status \"58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4\": rpc error: code = NotFound desc = could not find container \"58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4\": container with ID starting with 58c95cfa681644d9144493ecd5b0247787e27d79f35c8f14cb7b397061fe2ff4 not found: ID does not exist" Feb 27 17:52:30 crc kubenswrapper[4752]: I0227 17:52:30.920017 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" path="/var/lib/kubelet/pods/acea6aaf-6bcb-4b61-9297-90d2784b2505/volumes" Feb 27 17:52:31 crc kubenswrapper[4752]: E0227 17:52:31.868733 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5" Feb 27 17:52:31 crc kubenswrapper[4752]: E0227 17:52:31.868971 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:nmstate-metrics,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_METRICS_MANAGER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:52:32 crc kubenswrapper[4752]: E0227 17:52:32.960469 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902" Feb 27 17:52:32 crc kubenswrapper[4752]: E0227 17:52:32.960959 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,Command:[],Args:[--logtostderr --secure-listen-address=:8443 
--upstream=http://127.0.0.1:8089],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:52:32 crc kubenswrapper[4752]: E0227 17:52:32.962262 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nmstate-metrics\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kube-rbac-proxy-rhel9@sha256=8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501/signature-11: status 500 (Internal Server Error)\"]" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:52:33 crc kubenswrapper[4752]: E0227 17:52:33.089753 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902\\\"\"]" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:52:36 crc kubenswrapper[4752]: I0227 17:52:36.324100 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:52:36 crc kubenswrapper[4752]: I0227 17:52:36.324406 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:52:47 crc kubenswrapper[4752]: E0227 17:52:47.911382 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902\\\"\"]" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.508599 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509375 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="extract-utilities" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509388 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="extract-utilities" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509400 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509408 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509419 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="extract-content" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509427 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="extract-content" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509435 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="extract-utilities" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509441 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="extract-utilities" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509457 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b753d203-d3cb-4304-b753-4ec10344426b" containerName="oc" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509463 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b753d203-d3cb-4304-b753-4ec10344426b" containerName="oc" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509472 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" 
containerName="extract-content" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509478 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="extract-content" Feb 27 17:53:01 crc kubenswrapper[4752]: E0227 17:53:01.509488 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509494 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509615 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="acea6aaf-6bcb-4b61-9297-90d2784b2505" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509625 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b753d203-d3cb-4304-b753-4ec10344426b" containerName="oc" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.509634 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bda19b6-f483-43a8-9d7e-21a8ce77fe9f" containerName="registry-server" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.510425 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.541396 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.661919 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r42n\" (UniqueName: \"kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.661985 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.662026 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.763508 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.763701 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r42n\" (UniqueName: \"kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n\") pod \"redhat-marketplace-phmqx\" (UID: 
\"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.763777 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.764032 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.764611 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.799227 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r42n\" (UniqueName: \"kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n\") pod \"redhat-marketplace-phmqx\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:01 crc kubenswrapper[4752]: I0227 17:53:01.844446 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:53:02 crc kubenswrapper[4752]: I0227 17:53:02.158932 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:53:02 crc kubenswrapper[4752]: W0227 17:53:02.165934 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8647bb_a173_4e43_b3f4_7f16f67deff5.slice/crio-721eab114ac00861fb6ea98a3a9a5e5e1f6cfdb03b030efd3962fd6d718991ff WatchSource:0}: Error finding container 721eab114ac00861fb6ea98a3a9a5e5e1f6cfdb03b030efd3962fd6d718991ff: Status 404 returned error can't find the container with id 721eab114ac00861fb6ea98a3a9a5e5e1f6cfdb03b030efd3962fd6d718991ff Feb 27 17:53:02 crc kubenswrapper[4752]: E0227 17:53:02.242694 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5" Feb 27 17:53:02 crc kubenswrapper[4752]: E0227 17:53:02.243039 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:nmstate-metrics,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_METRICS_MANAGER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:02 crc kubenswrapper[4752]: I0227 17:53:02.301397 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerStarted","Data":"721eab114ac00861fb6ea98a3a9a5e5e1f6cfdb03b030efd3962fd6d718991ff"} Feb 27 17:53:03 crc kubenswrapper[4752]: I0227 17:53:03.144653 4752 scope.go:117] "RemoveContainer" containerID="43ed2cb25ef8cbe5b9ae4d0d046dfce4e123bd2d0f29b45c8e6c1003cfd24efa" Feb 27 17:53:03 crc kubenswrapper[4752]: I0227 17:53:03.311909 4752 generic.go:334] "Generic (PLEG): container finished" podID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerID="dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42" exitCode=0 Feb 27 17:53:03 crc kubenswrapper[4752]: I0227 17:53:03.312022 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerDied","Data":"dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42"} Feb 27 17:53:04 crc kubenswrapper[4752]: E0227 17:53:04.052870 4752 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:53:04 crc kubenswrapper[4752]: E0227 17:53:04.053018 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8r42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-phmqx_openshift-marketplace(2a8647bb-a173-4e43-b3f4-7f16f67deff5): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:04 crc kubenswrapper[4752]: E0227 17:53:04.054396 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:04 crc kubenswrapper[4752]: E0227 17:53:04.325574 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:06 crc kubenswrapper[4752]: I0227 17:53:06.324135 4752 
patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:53:06 crc kubenswrapper[4752]: I0227 17:53:06.324677 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:53:06 crc kubenswrapper[4752]: I0227 17:53:06.324749 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:53:06 crc kubenswrapper[4752]: I0227 17:53:06.325678 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:53:06 crc kubenswrapper[4752]: I0227 17:53:06.325779 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846" gracePeriod=600 Feb 27 17:53:07 crc kubenswrapper[4752]: I0227 17:53:07.355273 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846" exitCode=0 Feb 27 17:53:07 crc kubenswrapper[4752]: I0227 17:53:07.357304 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846"} Feb 27 17:53:07 crc kubenswrapper[4752]: I0227 17:53:07.357480 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13"} Feb 27 17:53:07 crc kubenswrapper[4752]: I0227 17:53:07.357555 4752 scope.go:117] "RemoveContainer" containerID="a53f865de5bd7bff88a289306c2cd6d9f814402e7d74b87753c7f92b7f4a7a83" Feb 27 17:53:15 crc kubenswrapper[4752]: E0227 17:53:15.543826 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:53:15 crc kubenswrapper[4752]: E0227 17:53:15.544699 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8r42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-phmqx_openshift-marketplace(2a8647bb-a173-4e43-b3f4-7f16f67deff5): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:15 crc kubenswrapper[4752]: E0227 17:53:15.545992 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:17 crc kubenswrapper[4752]: E0227 17:53:17.902598 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:53:18 crc kubenswrapper[4752]: I0227 17:53:18.432493 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" event={"ID":"592dcc36-2b17-4a10-b182-b693490e83c7","Type":"ContainerStarted","Data":"becbb71a79abe629f50f030c51f3d8872e36e81561ad57403e91b8e4805e2b76"} Feb 27 17:53:18 crc kubenswrapper[4752]: E0227 17:53:18.434380 4752 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:53:27 crc kubenswrapper[4752]: E0227 17:53:27.908597 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:29 crc kubenswrapper[4752]: E0227 17:53:29.630036 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5" Feb 27 17:53:29 crc kubenswrapper[4752]: E0227 17:53:29.630550 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-handler,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:COMPONENT,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/component'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PART_OF,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/part-of'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/version'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:MANAGED_BY,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['app.kubernetes.io/managed-by'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},EnvVar{Name:NMSTATE_INSTANCE_NODE_LOCK_FILE,Value:/var/k8s_nmstate/handler_lock,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:dbus-socket,ReadOnly:false,MountPath:/run/dbus/system_bus_socket,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nmstate-lock,ReadOnly:false,MountPath:/var/k8s_nmstate,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovs-socket,ReadOnly:false,MountPath:/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm9bv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[cat /tmp/healthy],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-handler-7wpqr_openshift-nmstate(5fc4e66a-972d-4516-96ba-fa4b56a181a0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:29 crc kubenswrapper[4752]: E0227 17:53:29.631800 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:53:30 crc kubenswrapper[4752]: E0227 17:53:30.913702 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:53:40 crc kubenswrapper[4752]: E0227 17:53:40.915593 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:53:43 crc kubenswrapper[4752]: E0227 17:53:43.536890 4752 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 17:53:43 crc kubenswrapper[4752]: E0227 17:53:43.537469 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8r42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-phmqx_openshift-marketplace(2a8647bb-a173-4e43-b3f4-7f16f67deff5): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:43 crc kubenswrapper[4752]: E0227 17:53:43.538746 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:48 crc kubenswrapper[4752]: E0227 17:53:48.191421 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server 
Error)" image="registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5" Feb 27 17:53:48 crc kubenswrapper[4752]: E0227 17:53:48.191984 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nmstate-metrics,Image:registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5,Command:[manager],Args:[--zap-time-encoding=iso8601],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RUN_METRICS_MANAGER,Value:,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAME,Value:nmstate,ValueFrom:nil,},EnvVar{Name:ENABLE_PROFILER,Value:False,ValueFrom:nil,},EnvVar{Name:PROFILER_PORT,Value:6060,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{30 -3} {} 30m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28fvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nmstate-metrics-69594cc75-l2g98_openshift-nmstate(592dcc36-2b17-4a10-b182-b693490e83c7): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:53:48 crc kubenswrapper[4752]: E0227 17:53:48.193326 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256=e7b5bc003ac75bc62f0dcf89e3c906788a034a5a5c28321afa9e3e8773559108/signature-11: status 500 (Internal Server Error)\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:53:54 crc kubenswrapper[4752]: E0227 17:53:54.911713 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:53:57 crc kubenswrapper[4752]: E0227 17:53:57.910429 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:53:59 crc kubenswrapper[4752]: E0227 17:53:59.910797 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.151793 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536914-6jdgn"] Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.152515 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.155582 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.155964 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.156208 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.177920 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536914-6jdgn"] Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.315699 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnvt8\" (UniqueName: \"kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8\") pod \"auto-csr-approver-29536914-6jdgn\" (UID: \"60b83e90-2de3-4492-a7fe-1593d2c90af0\") " pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.418338 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnvt8\" (UniqueName: \"kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8\") pod \"auto-csr-approver-29536914-6jdgn\" (UID: \"60b83e90-2de3-4492-a7fe-1593d2c90af0\") " pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.453041 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnvt8\" (UniqueName: \"kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8\") pod \"auto-csr-approver-29536914-6jdgn\" (UID: \"60b83e90-2de3-4492-a7fe-1593d2c90af0\") " pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.478358 4752 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:00 crc kubenswrapper[4752]: I0227 17:54:00.774357 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536914-6jdgn"] Feb 27 17:54:01 crc kubenswrapper[4752]: I0227 17:54:01.738265 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" event={"ID":"60b83e90-2de3-4492-a7fe-1593d2c90af0","Type":"ContainerStarted","Data":"c04e6ad147a70394f385a73f5c239fda93847a9a170cad0b4766acc1da66c6fa"} Feb 27 17:54:02 crc kubenswrapper[4752]: I0227 17:54:02.747902 4752 generic.go:334] "Generic (PLEG): container finished" podID="60b83e90-2de3-4492-a7fe-1593d2c90af0" containerID="735aa090e8a47d3db26ba5d3f5cbd42dcbc66508994cd9cb5bd0284908ef198d" exitCode=0 Feb 27 17:54:02 crc kubenswrapper[4752]: I0227 17:54:02.747987 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" event={"ID":"60b83e90-2de3-4492-a7fe-1593d2c90af0","Type":"ContainerDied","Data":"735aa090e8a47d3db26ba5d3f5cbd42dcbc66508994cd9cb5bd0284908ef198d"} Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.046230 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.198859 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnvt8\" (UniqueName: \"kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8\") pod \"60b83e90-2de3-4492-a7fe-1593d2c90af0\" (UID: \"60b83e90-2de3-4492-a7fe-1593d2c90af0\") " Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.207140 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8" (OuterVolumeSpecName: "kube-api-access-mnvt8") pod "60b83e90-2de3-4492-a7fe-1593d2c90af0" (UID: "60b83e90-2de3-4492-a7fe-1593d2c90af0"). InnerVolumeSpecName "kube-api-access-mnvt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.300980 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnvt8\" (UniqueName: \"kubernetes.io/projected/60b83e90-2de3-4492-a7fe-1593d2c90af0-kube-api-access-mnvt8\") on node \"crc\" DevicePath \"\"" Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.765814 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" event={"ID":"60b83e90-2de3-4492-a7fe-1593d2c90af0","Type":"ContainerDied","Data":"c04e6ad147a70394f385a73f5c239fda93847a9a170cad0b4766acc1da66c6fa"} Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.765892 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04e6ad147a70394f385a73f5c239fda93847a9a170cad0b4766acc1da66c6fa" Feb 27 17:54:04 crc kubenswrapper[4752]: I0227 17:54:04.765927 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536914-6jdgn" Feb 27 17:54:05 crc kubenswrapper[4752]: I0227 17:54:05.123087 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536908-zvbsb"] Feb 27 17:54:05 crc kubenswrapper[4752]: I0227 17:54:05.130996 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536908-zvbsb"] Feb 27 17:54:05 crc kubenswrapper[4752]: E0227 17:54:05.910993 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:54:06 crc kubenswrapper[4752]: I0227 17:54:06.919775 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4babdb15-835b-4965-9af0-4a697c85f645" path="/var/lib/kubelet/pods/4babdb15-835b-4965-9af0-4a697c85f645/volumes" Feb 27 17:54:09 crc kubenswrapper[4752]: E0227 17:54:09.911035 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" Feb 27 17:54:11 crc kubenswrapper[4752]: E0227 17:54:11.910416 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:54:16 crc kubenswrapper[4752]: E0227 17:54:16.911524 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:54:24 crc kubenswrapper[4752]: E0227 17:54:24.907858 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:54:24 crc kubenswrapper[4752]: I0227 17:54:24.932526 4752 generic.go:334] "Generic (PLEG): container finished" podID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerID="04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7" exitCode=0 Feb 27 17:54:24 crc kubenswrapper[4752]: I0227 17:54:24.932584 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerDied","Data":"04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7"} Feb 27 17:54:25 crc 
kubenswrapper[4752]: I0227 17:54:25.939820 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerStarted","Data":"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632"} Feb 27 17:54:25 crc kubenswrapper[4752]: I0227 17:54:25.967863 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-phmqx" podStartSLOduration=2.947356158 podStartE2EDuration="1m24.967839397s" podCreationTimestamp="2026-02-27 17:53:01 +0000 UTC" firstStartedPulling="2026-02-27 17:53:03.314396935 +0000 UTC m=+1083.221213826" lastFinishedPulling="2026-02-27 17:54:25.334880164 +0000 UTC m=+1165.241697065" observedRunningTime="2026-02-27 17:54:25.963324522 +0000 UTC m=+1165.870141413" watchObservedRunningTime="2026-02-27 17:54:25.967839397 +0000 UTC m=+1165.874656288" Feb 27 17:54:31 crc kubenswrapper[4752]: I0227 17:54:31.845610 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:31 crc kubenswrapper[4752]: I0227 17:54:31.846443 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:31 crc kubenswrapper[4752]: I0227 17:54:31.897536 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:31 crc kubenswrapper[4752]: E0227 17:54:31.908872 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:54:32 crc kubenswrapper[4752]: I0227 17:54:32.045235 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:32 crc kubenswrapper[4752]: I0227 17:54:32.176704 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:54:33 crc kubenswrapper[4752]: I0227 17:54:33.990086 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-phmqx" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="registry-server" containerID="cri-o://bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632" gracePeriod=2 Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.471695 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.646424 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r42n\" (UniqueName: \"kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n\") pod \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.646630 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content\") pod \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.646662 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities\") pod \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\" (UID: \"2a8647bb-a173-4e43-b3f4-7f16f67deff5\") " Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.648362 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities" (OuterVolumeSpecName: "utilities") pod "2a8647bb-a173-4e43-b3f4-7f16f67deff5" (UID: "2a8647bb-a173-4e43-b3f4-7f16f67deff5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.652703 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n" (OuterVolumeSpecName: "kube-api-access-8r42n") pod "2a8647bb-a173-4e43-b3f4-7f16f67deff5" (UID: "2a8647bb-a173-4e43-b3f4-7f16f67deff5"). InnerVolumeSpecName "kube-api-access-8r42n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.707507 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a8647bb-a173-4e43-b3f4-7f16f67deff5" (UID: "2a8647bb-a173-4e43-b3f4-7f16f67deff5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.749018 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.749080 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8647bb-a173-4e43-b3f4-7f16f67deff5-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.749106 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r42n\" (UniqueName: \"kubernetes.io/projected/2a8647bb-a173-4e43-b3f4-7f16f67deff5-kube-api-access-8r42n\") on node \"crc\" DevicePath \"\"" Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.998739 4752 generic.go:334] "Generic (PLEG): container finished" podID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerID="bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632" exitCode=0 Feb 27 17:54:34 crc kubenswrapper[4752]: I0227 17:54:34.998800 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerDied","Data":"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632"} Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:34.998842 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-phmqx" event={"ID":"2a8647bb-a173-4e43-b3f4-7f16f67deff5","Type":"ContainerDied","Data":"721eab114ac00861fb6ea98a3a9a5e5e1f6cfdb03b030efd3962fd6d718991ff"} Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:34.998869 4752 scope.go:117] "RemoveContainer" containerID="bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:34.998986 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-phmqx" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.025599 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.029638 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-phmqx"] Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.035985 4752 scope.go:117] "RemoveContainer" containerID="04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.058522 4752 scope.go:117] "RemoveContainer" containerID="dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.094000 4752 scope.go:117] "RemoveContainer" containerID="bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632" Feb 27 17:54:35 crc kubenswrapper[4752]: E0227 17:54:35.094710 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632\": container with ID starting with bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632 not found: ID does not exist" containerID="bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.094777 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632"} err="failed to get container status \"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632\": rpc error: code = NotFound desc = could not find container \"bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632\": container with ID starting with bdf15e224d8d46f97f022d02615d985a2db2e812e673e63355ac93e57dc17632 not found: ID does not exist" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.094819 4752 scope.go:117] "RemoveContainer" containerID="04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7" Feb 27 17:54:35 crc kubenswrapper[4752]: E0227 17:54:35.095321 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7\": container with ID starting with 04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7 not found: ID does not exist" containerID="04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.095376 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7"} err="failed to get container status \"04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7\": rpc error: code = NotFound desc = could not find container \"04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7\": container with ID starting with 04d8a2f54f3ff36b08df932350db8475ef56d44c5880b877bc453b51b23706f7 not found: ID does not exist" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.095411 4752 scope.go:117] "RemoveContainer" containerID="dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42" Feb 27 17:54:35 crc kubenswrapper[4752]: E0227 17:54:35.095819 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42\": container with ID starting with dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42 not found: ID does not exist" containerID="dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42" Feb 27 17:54:35 crc kubenswrapper[4752]: I0227 17:54:35.095841 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42"} err="failed to get container status \"dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42\": rpc error: code = NotFound desc = could not find container \"dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42\": container with ID starting with dbf56930ca3de4579315311a6e22d81274947bed501d5a1b3ee390de4cdd2b42 not found: ID does not exist" Feb 27 17:54:36 crc kubenswrapper[4752]: I0227 17:54:36.914487 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" path="/var/lib/kubelet/pods/2a8647bb-a173-4e43-b3f4-7f16f67deff5/volumes" Feb 27 17:54:38 crc kubenswrapper[4752]: E0227 17:54:38.910137 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:54:43 crc kubenswrapper[4752]: E0227 17:54:43.909369 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-handler\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-handler-7wpqr" podUID="5fc4e66a-972d-4516-96ba-fa4b56a181a0" Feb 27 17:54:51 crc kubenswrapper[4752]: E0227 17:54:51.909440 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:54:58 crc kubenswrapper[4752]: I0227 17:54:58.910161 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 17:55:01 crc kubenswrapper[4752]: I0227 17:55:01.187421 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7wpqr" event={"ID":"5fc4e66a-972d-4516-96ba-fa4b56a181a0","Type":"ContainerStarted","Data":"ebe0157e81ee0f993d601111b63cd013f1cec29451f38b5f4a0b7fed62e095f4"} Feb 27 17:55:01 crc kubenswrapper[4752]: I0227 17:55:01.188246 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7wpqr" Feb 27 17:55:01 crc kubenswrapper[4752]: I0227 17:55:01.214374 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7wpqr" podStartSLOduration=1.626373344 podStartE2EDuration="4m56.214348331s" podCreationTimestamp="2026-02-27 
17:50:05 +0000 UTC" firstStartedPulling="2026-02-27 17:50:05.46099824 +0000 UTC m=+905.367815091" lastFinishedPulling="2026-02-27 17:55:00.048973217 +0000 UTC m=+1199.955790078" observedRunningTime="2026-02-27 17:55:01.213386386 +0000 UTC m=+1201.120203317" watchObservedRunningTime="2026-02-27 17:55:01.214348331 +0000 UTC m=+1201.121165212" Feb 27 17:55:05 crc kubenswrapper[4752]: I0227 17:55:05.463567 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7wpqr" Feb 27 17:55:06 crc kubenswrapper[4752]: I0227 17:55:06.323497 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:55:06 crc kubenswrapper[4752]: I0227 17:55:06.323564 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:55:06 crc kubenswrapper[4752]: E0227 17:55:06.910779 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nmstate-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-kubernetes-nmstate-handler-rhel9@sha256:5c00ed4b5d044125b3dc619b01575e86f3955d6549ef398ccc91bbf21ceb6ad5\\\"\"" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podUID="592dcc36-2b17-4a10-b182-b693490e83c7" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.966705 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk"] Feb 27 17:55:21 crc kubenswrapper[4752]: E0227 17:55:21.967876 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="extract-content" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.967899 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="extract-content" Feb 27 17:55:21 crc kubenswrapper[4752]: E0227 17:55:21.967927 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="registry-server" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.967939 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="registry-server" Feb 27 17:55:21 crc kubenswrapper[4752]: E0227 17:55:21.967961 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="extract-utilities" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.967976 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="extract-utilities" Feb 27 17:55:21 crc kubenswrapper[4752]: E0227 17:55:21.967999 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b83e90-2de3-4492-a7fe-1593d2c90af0" containerName="oc" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.968014 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b83e90-2de3-4492-a7fe-1593d2c90af0" containerName="oc" Feb 27 17:55:21 crc 
kubenswrapper[4752]: I0227 17:55:21.968209 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b83e90-2de3-4492-a7fe-1593d2c90af0" containerName="oc" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.968233 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8647bb-a173-4e43-b3f4-7f16f67deff5" containerName="registry-server" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.969580 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.973674 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.980827 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bxj\" (UniqueName: \"kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.981049 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.981237 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:21 crc kubenswrapper[4752]: I0227 17:55:21.989172 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk"] Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.083266 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.083420 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.083539 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bxj\" (UniqueName: 
\"kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.084445 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.085230 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.111976 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7bxj\" (UniqueName: \"kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.350256 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" event={"ID":"592dcc36-2b17-4a10-b182-b693490e83c7","Type":"ContainerStarted","Data":"f0dec0276d5fc835ff7556804a4508b107d68f960a952cee821dc0dec3c3a243"} Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.357705 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.376499 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-l2g98" podStartSLOduration=2.065598764 podStartE2EDuration="5m18.376473367s" podCreationTimestamp="2026-02-27 17:50:04 +0000 UTC" firstStartedPulling="2026-02-27 17:50:05.65196049 +0000 UTC m=+905.558777361" lastFinishedPulling="2026-02-27 17:55:21.962835083 +0000 UTC m=+1221.869651964" observedRunningTime="2026-02-27 17:55:22.372898546 +0000 UTC m=+1222.279715447" watchObservedRunningTime="2026-02-27 17:55:22.376473367 +0000 UTC m=+1222.283290268" Feb 27 17:55:22 crc kubenswrapper[4752]: I0227 17:55:22.640558 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk"] Feb 27 17:55:22 crc kubenswrapper[4752]: W0227 17:55:22.648877 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3272ad9_e002_48b6_92b4_27da6186f45a.slice/crio-12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8 WatchSource:0}: Error finding container 12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8: Status 404 returned error can't find the container with id 12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8 Feb 27 17:55:23 crc kubenswrapper[4752]: I0227 17:55:23.359543 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerID="cb832e3eda5173f1ca3624730aad32d7708d56ce1aa52cb2013adfef0be97145" exitCode=0 Feb 27 17:55:23 crc kubenswrapper[4752]: I0227 17:55:23.359609 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" event={"ID":"b3272ad9-e002-48b6-92b4-27da6186f45a","Type":"ContainerDied","Data":"cb832e3eda5173f1ca3624730aad32d7708d56ce1aa52cb2013adfef0be97145"} Feb 27 17:55:23 crc kubenswrapper[4752]: I0227 17:55:23.359665 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" event={"ID":"b3272ad9-e002-48b6-92b4-27da6186f45a","Type":"ContainerStarted","Data":"12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8"} Feb 27 17:55:36 crc kubenswrapper[4752]: I0227 17:55:36.323914 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:55:36 crc kubenswrapper[4752]: I0227 17:55:36.324672 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.164734 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536916-h8kqw"] Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.167368 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.169845 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.171987 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.172209 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536916-h8kqw"] Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.172311 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.338440 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2dfp\" (UniqueName: \"kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp\") pod \"auto-csr-approver-29536916-h8kqw\" (UID: \"ef8bcb56-ca08-4640-895c-b122c4ad2ad3\") " pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.440326 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2dfp\" (UniqueName: \"kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp\") pod \"auto-csr-approver-29536916-h8kqw\" (UID: \"ef8bcb56-ca08-4640-895c-b122c4ad2ad3\") " pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.473665 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2dfp\" (UniqueName: \"kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp\") pod \"auto-csr-approver-29536916-h8kqw\" (UID: \"ef8bcb56-ca08-4640-895c-b122c4ad2ad3\") " pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.497384 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:00 crc kubenswrapper[4752]: I0227 17:56:00.729174 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536916-h8kqw"] Feb 27 17:56:01 crc kubenswrapper[4752]: I0227 17:56:01.651484 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" event={"ID":"ef8bcb56-ca08-4640-895c-b122c4ad2ad3","Type":"ContainerStarted","Data":"bc60f6f7a295c65267a99d233c165a1adcaeadd6939f0d1a430f02a3f445bd72"} Feb 27 17:56:02 crc kubenswrapper[4752]: I0227 17:56:02.659664 4752 generic.go:334] "Generic (PLEG): container finished" podID="ef8bcb56-ca08-4640-895c-b122c4ad2ad3" containerID="d3306d8c3a041cdf001cb44eef0e59d3d5f852ca2fcfc3fdc7d8b88bcdcf6883" exitCode=0 Feb 27 17:56:02 crc kubenswrapper[4752]: I0227 17:56:02.659855 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" event={"ID":"ef8bcb56-ca08-4640-895c-b122c4ad2ad3","Type":"ContainerDied","Data":"d3306d8c3a041cdf001cb44eef0e59d3d5f852ca2fcfc3fdc7d8b88bcdcf6883"} Feb 27 17:56:03 crc kubenswrapper[4752]: I0227 17:56:03.296782 4752 scope.go:117] "RemoveContainer" containerID="056b82ec81a7f51986d7257c0e67aa075fbce8d6a82e958f79db2c3bda2fe32a" Feb 27 17:56:03 crc kubenswrapper[4752]: I0227 17:56:03.980358 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.091060 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2dfp\" (UniqueName: \"kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp\") pod \"ef8bcb56-ca08-4640-895c-b122c4ad2ad3\" (UID: \"ef8bcb56-ca08-4640-895c-b122c4ad2ad3\") " Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.097210 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp" (OuterVolumeSpecName: "kube-api-access-p2dfp") pod "ef8bcb56-ca08-4640-895c-b122c4ad2ad3" (UID: "ef8bcb56-ca08-4640-895c-b122c4ad2ad3"). InnerVolumeSpecName "kube-api-access-p2dfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.193028 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2dfp\" (UniqueName: \"kubernetes.io/projected/ef8bcb56-ca08-4640-895c-b122c4ad2ad3-kube-api-access-p2dfp\") on node \"crc\" DevicePath \"\"" Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.677107 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" event={"ID":"ef8bcb56-ca08-4640-895c-b122c4ad2ad3","Type":"ContainerDied","Data":"bc60f6f7a295c65267a99d233c165a1adcaeadd6939f0d1a430f02a3f445bd72"} Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.677160 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536916-h8kqw" Feb 27 17:56:04 crc kubenswrapper[4752]: I0227 17:56:04.677205 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc60f6f7a295c65267a99d233c165a1adcaeadd6939f0d1a430f02a3f445bd72" Feb 27 17:56:05 crc kubenswrapper[4752]: I0227 17:56:05.040748 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536910-lld75"] Feb 27 17:56:05 crc kubenswrapper[4752]: I0227 17:56:05.048896 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536910-lld75"] Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.323546 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.324227 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.324364 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.325377 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.325536 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13" gracePeriod=600 Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.694576 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13" exitCode=0 Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.694835 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13"} Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.694969 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43"} Feb 27 17:56:06 crc kubenswrapper[4752]: I0227 17:56:06.695000 4752 scope.go:117] "RemoveContainer" containerID="9dcaf0a23a37ae06ee6e0942e328a8ecbd6e4ed2d990fbabc5b07156fcd3f846" Feb 27 17:56:06 crc 
kubenswrapper[4752]: I0227 17:56:06.917763 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9fc9d7-94c5-4b1f-ab54-13183ac41df4" path="/var/lib/kubelet/pods/3e9fc9d7-94c5-4b1f-ab54-13183ac41df4/volumes" Feb 27 17:56:24 crc kubenswrapper[4752]: E0227 17:56:24.949807 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f" Feb 27 17:56:24 crc kubenswrapper[4752]: E0227 17:56:24.950535 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7bxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_openshift-marketplace(b3272ad9-e002-48b6-92b4-27da6186f45a): ErrImagePull: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:56:24 crc kubenswrapper[4752]: E0227 17:56:24.952000 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:56:25 crc kubenswrapper[4752]: E0227 17:56:25.841772 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:56:42 crc kubenswrapper[4752]: E0227 17:56:42.047232 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f" Feb 27 17:56:42 crc kubenswrapper[4752]: E0227 17:56:42.048218 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7bxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_openshift-marketplace(b3272ad9-e002-48b6-92b4-27da6186f45a): ErrImagePull: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:56:42 crc kubenswrapper[4752]: E0227 17:56:42.049435 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" 
podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:56:55 crc kubenswrapper[4752]: E0227 17:56:55.917871 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:57:03 crc kubenswrapper[4752]: I0227 17:57:03.379769 4752 scope.go:117] "RemoveContainer" containerID="a066bde52a9e7ad432fc41bc146e4b4848a1a43df3324b8c04e110260728c084" Feb 27 17:57:07 crc kubenswrapper[4752]: E0227 17:57:07.940552 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f" Feb 27 17:57:07 crc kubenswrapper[4752]: E0227 17:57:07.941385 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7bxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_openshift-marketplace(b3272ad9-e002-48b6-92b4-27da6186f45a): ErrImagePull: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:57:07 crc kubenswrapper[4752]: E0227 17:57:07.942744 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"reading signatures: 
Feb 27 17:57:20 crc kubenswrapper[4752]: E0227 17:57:20.926422 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a"
Feb 27 17:57:34 crc kubenswrapper[4752]: E0227 17:57:34.909385 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a"
Feb 27 17:57:47 crc kubenswrapper[4752]: E0227 17:57:47.909518 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.144322 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536918-ztclp"]
Feb 27 17:58:00 crc kubenswrapper[4752]: E0227 17:58:00.145688 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8bcb56-ca08-4640-895c-b122c4ad2ad3" containerName="oc"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.145716 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8bcb56-ca08-4640-895c-b122c4ad2ad3" containerName="oc"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.145954 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8bcb56-ca08-4640-895c-b122c4ad2ad3" containerName="oc"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.146874 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536918-ztclp"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.149030 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.149489 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536918-ztclp"]
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.150315 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.150384 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.178717 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvz5p\" (UniqueName: \"kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p\") pod \"auto-csr-approver-29536918-ztclp\" (UID: \"f3cc92ab-e406-46bd-8b55-6bc73db57254\") " pod="openshift-infra/auto-csr-approver-29536918-ztclp"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.279614 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvz5p\" (UniqueName: \"kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p\") pod \"auto-csr-approver-29536918-ztclp\" (UID: \"f3cc92ab-e406-46bd-8b55-6bc73db57254\") " pod="openshift-infra/auto-csr-approver-29536918-ztclp"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.303142 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvz5p\" (UniqueName: \"kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p\") pod \"auto-csr-approver-29536918-ztclp\" (UID: \"f3cc92ab-e406-46bd-8b55-6bc73db57254\") " pod="openshift-infra/auto-csr-approver-29536918-ztclp"
Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.468693 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536918-ztclp"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536918-ztclp" Feb 27 17:58:00 crc kubenswrapper[4752]: I0227 17:58:00.787574 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536918-ztclp"] Feb 27 17:58:00 crc kubenswrapper[4752]: W0227 17:58:00.798373 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3cc92ab_e406_46bd_8b55_6bc73db57254.slice/crio-297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6 WatchSource:0}: Error finding container 297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6: Status 404 returned error can't find the container with id 297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6 Feb 27 17:58:01 crc kubenswrapper[4752]: I0227 17:58:01.522533 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536918-ztclp" event={"ID":"f3cc92ab-e406-46bd-8b55-6bc73db57254","Type":"ContainerStarted","Data":"297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6"} Feb 27 17:58:02 crc kubenswrapper[4752]: E0227 17:58:02.140065 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 17:58:02 crc kubenswrapper[4752]: E0227 17:58:02.140273 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 17:58:02 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 17:58:02 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvz5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536918-ztclp_openshift-infra(f3cc92ab-e406-46bd-8b55-6bc73db57254): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 17:58:02 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 17:58:02 crc kubenswrapper[4752]: E0227 17:58:02.141664 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536918-ztclp" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" Feb 27 17:58:02 crc kubenswrapper[4752]: E0227 17:58:02.536036 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536918-ztclp" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" Feb 27 17:58:06 crc kubenswrapper[4752]: I0227 17:58:06.323110 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:58:06 crc kubenswrapper[4752]: I0227 17:58:06.323626 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:58:14 crc kubenswrapper[4752]: I0227 17:58:14.633701 4752 generic.go:334] "Generic (PLEG): container finished" podID="f3cc92ab-e406-46bd-8b55-6bc73db57254" containerID="1726d22bc08540dbaf8bee13b3890888ef4b1c1533e6f26bd9da7a3f0034e027" exitCode=0 Feb 27 17:58:14 crc kubenswrapper[4752]: I0227 17:58:14.634001 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536918-ztclp" event={"ID":"f3cc92ab-e406-46bd-8b55-6bc73db57254","Type":"ContainerDied","Data":"1726d22bc08540dbaf8bee13b3890888ef4b1c1533e6f26bd9da7a3f0034e027"} Feb 27 17:58:15 crc kubenswrapper[4752]: I0227 17:58:15.995877 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536918-ztclp" Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.024300 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvz5p\" (UniqueName: \"kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p\") pod \"f3cc92ab-e406-46bd-8b55-6bc73db57254\" (UID: \"f3cc92ab-e406-46bd-8b55-6bc73db57254\") " Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.035719 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p" (OuterVolumeSpecName: "kube-api-access-jvz5p") pod "f3cc92ab-e406-46bd-8b55-6bc73db57254" (UID: "f3cc92ab-e406-46bd-8b55-6bc73db57254"). InnerVolumeSpecName "kube-api-access-jvz5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.126981 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvz5p\" (UniqueName: \"kubernetes.io/projected/f3cc92ab-e406-46bd-8b55-6bc73db57254-kube-api-access-jvz5p\") on node \"crc\" DevicePath \"\"" Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.650291 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536918-ztclp" event={"ID":"f3cc92ab-e406-46bd-8b55-6bc73db57254","Type":"ContainerDied","Data":"297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6"} Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.650686 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="297e8c424ea997fe12183110a70cf77319b4a95345949723650c24b690afd8e6" Feb 27 17:58:16 crc kubenswrapper[4752]: I0227 17:58:16.650359 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536918-ztclp" Feb 27 17:58:17 crc kubenswrapper[4752]: I0227 17:58:17.077654 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536912-zr29z"] Feb 27 17:58:17 crc kubenswrapper[4752]: I0227 17:58:17.092327 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536912-zr29z"] Feb 27 17:58:18 crc kubenswrapper[4752]: I0227 17:58:18.923301 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b753d203-d3cb-4304-b753-4ec10344426b" path="/var/lib/kubelet/pods/b753d203-d3cb-4304-b753-4ec10344426b/volumes" Feb 27 17:58:36 crc kubenswrapper[4752]: I0227 17:58:36.324238 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:58:36 crc kubenswrapper[4752]: I0227 17:58:36.324633 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 17:58:46 crc kubenswrapper[4752]: E0227 17:58:46.969616 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f" Feb 27 17:58:46 crc kubenswrapper[4752]: E0227 17:58:46.970399 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7bxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_openshift-marketplace(b3272ad9-e002-48b6-92b4-27da6186f45a): ErrImagePull: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 17:58:46 crc kubenswrapper[4752]: E0227 17:58:46.971628 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:59:00 crc kubenswrapper[4752]: E0227 17:59:00.917357 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:59:03 crc kubenswrapper[4752]: I0227 17:59:03.475287 4752 scope.go:117] "RemoveContainer" containerID="3ddad92e6bc219d176dca73e2ebedc05464e9d25492781347bddf6b670a59fcf" Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.323783 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.325641 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.325848 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.326788 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.327049 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43" gracePeriod=600 Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.583645 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43" exitCode=0 Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.583750 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43"} Feb 27 17:59:06 crc kubenswrapper[4752]: I0227 17:59:06.584023 4752 scope.go:117] "RemoveContainer" containerID="4e7de8fe350c8a0eff755a90f23174bdf3c2f2e4407f67fe8eede9d946c66a13" Feb 27 17:59:07 crc kubenswrapper[4752]: I0227 17:59:07.594009 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"} Feb 27 17:59:13 crc kubenswrapper[4752]: E0227 17:59:13.910666 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:59:27 crc kubenswrapper[4752]: E0227 17:59:27.910360 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:59:40 crc kubenswrapper[4752]: E0227 17:59:40.911390 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 17:59:55 crc kubenswrapper[4752]: E0227 17:59:55.909621 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.154376 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536920-tkl55"] Feb 27 18:00:00 crc kubenswrapper[4752]: E0227 18:00:00.155383 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" containerName="oc" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.155409 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" containerName="oc" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.155590 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" containerName="oc" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.156274 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.158373 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q"] Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.159024 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.159410 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.159191 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.160398 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.162606 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.164017 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.173323 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536920-tkl55"] Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.183203 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q"] Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.247381 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.247422 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgj9t\" (UniqueName: \"kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t\") pod \"auto-csr-approver-29536920-tkl55\" (UID: \"395fd68e-6845-4875-b0a0-de3c408b5fca\") " pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.247446 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.247484 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gj5b\" (UniqueName: \"kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.348548 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gj5b\" (UniqueName: \"kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.348750 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: 
\"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.348800 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgj9t\" (UniqueName: \"kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t\") pod \"auto-csr-approver-29536920-tkl55\" (UID: \"395fd68e-6845-4875-b0a0-de3c408b5fca\") " pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.348850 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.350921 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.362445 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.377956 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgj9t\" (UniqueName: \"kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t\") pod \"auto-csr-approver-29536920-tkl55\" (UID: \"395fd68e-6845-4875-b0a0-de3c408b5fca\") " pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.378052 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gj5b\" (UniqueName: \"kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b\") pod \"collect-profiles-29536920-bj78q\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.487343 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.506928 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.935773 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536920-tkl55"] Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.946230 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.971786 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q"] Feb 27 18:00:00 crc kubenswrapper[4752]: I0227 18:00:00.976125 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536920-tkl55" event={"ID":"395fd68e-6845-4875-b0a0-de3c408b5fca","Type":"ContainerStarted","Data":"ddc8c238a631a3b21644c9e99235f878f3bfc77759e3d11ab12452b82597cf7d"} Feb 27 18:00:00 crc kubenswrapper[4752]: W0227 18:00:00.979988 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2793737_701c_4d12_a9c1_d43d39dce4ea.slice/crio-fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc WatchSource:0}: Error finding container fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc: Status 404 returned error can't find the container with id fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc Feb 27 18:00:01 crc kubenswrapper[4752]: I0227 18:00:01.986460 4752 generic.go:334] "Generic (PLEG): container finished" podID="b2793737-701c-4d12-a9c1-d43d39dce4ea" containerID="26fcacaf1ba2907a73a2b9a415964f550b772caab715487edc26de97c044e279" exitCode=0 Feb 27 18:00:01 crc kubenswrapper[4752]: I0227 18:00:01.986533 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" event={"ID":"b2793737-701c-4d12-a9c1-d43d39dce4ea","Type":"ContainerDied","Data":"26fcacaf1ba2907a73a2b9a415964f550b772caab715487edc26de97c044e279"} Feb 27 18:00:01 crc kubenswrapper[4752]: I0227 18:00:01.986983 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" event={"ID":"b2793737-701c-4d12-a9c1-d43d39dce4ea","Type":"ContainerStarted","Data":"fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc"} Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.278972 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.291433 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume\") pod \"b2793737-701c-4d12-a9c1-d43d39dce4ea\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.291543 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume\") pod \"b2793737-701c-4d12-a9c1-d43d39dce4ea\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.291591 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gj5b\" (UniqueName: \"kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b\") pod \"b2793737-701c-4d12-a9c1-d43d39dce4ea\" (UID: \"b2793737-701c-4d12-a9c1-d43d39dce4ea\") " Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.292488 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2793737-701c-4d12-a9c1-d43d39dce4ea" (UID: "b2793737-701c-4d12-a9c1-d43d39dce4ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.332810 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2793737-701c-4d12-a9c1-d43d39dce4ea" (UID: "b2793737-701c-4d12-a9c1-d43d39dce4ea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.332927 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b" (OuterVolumeSpecName: "kube-api-access-8gj5b") pod "b2793737-701c-4d12-a9c1-d43d39dce4ea" (UID: "b2793737-701c-4d12-a9c1-d43d39dce4ea"). InnerVolumeSpecName "kube-api-access-8gj5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.393130 4752 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2793737-701c-4d12-a9c1-d43d39dce4ea-config-volume\") on node \"crc\" DevicePath \"\"" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.393207 4752 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2793737-701c-4d12-a9c1-d43d39dce4ea-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 27 18:00:03 crc kubenswrapper[4752]: I0227 18:00:03.393226 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gj5b\" (UniqueName: \"kubernetes.io/projected/b2793737-701c-4d12-a9c1-d43d39dce4ea-kube-api-access-8gj5b\") on node \"crc\" DevicePath \"\"" Feb 27 18:00:04 crc kubenswrapper[4752]: I0227 18:00:04.004008 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" Feb 27 18:00:04 crc kubenswrapper[4752]: I0227 18:00:04.003858 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29536920-bj78q" event={"ID":"b2793737-701c-4d12-a9c1-d43d39dce4ea","Type":"ContainerDied","Data":"fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc"} Feb 27 18:00:04 crc kubenswrapper[4752]: I0227 18:00:04.006303 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe99bc9581562d896489c317eccd12327dac5518120641c582f56d309caebddc" Feb 27 18:00:06 crc kubenswrapper[4752]: I0227 18:00:06.021102 4752 generic.go:334] "Generic (PLEG): container finished" podID="395fd68e-6845-4875-b0a0-de3c408b5fca" containerID="f6718d423c5b5f9c3221eab8ee92e3b4f38624e012e229e14c7df83132795756" exitCode=0 Feb 27 18:00:06 crc kubenswrapper[4752]: I0227 18:00:06.021300 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536920-tkl55" event={"ID":"395fd68e-6845-4875-b0a0-de3c408b5fca","Type":"ContainerDied","Data":"f6718d423c5b5f9c3221eab8ee92e3b4f38624e012e229e14c7df83132795756"} Feb 27 18:00:07 crc kubenswrapper[4752]: I0227 18:00:07.325535 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:07 crc kubenswrapper[4752]: I0227 18:00:07.345511 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgj9t\" (UniqueName: \"kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t\") pod \"395fd68e-6845-4875-b0a0-de3c408b5fca\" (UID: \"395fd68e-6845-4875-b0a0-de3c408b5fca\") " Feb 27 18:00:07 crc kubenswrapper[4752]: I0227 18:00:07.397313 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t" (OuterVolumeSpecName: "kube-api-access-sgj9t") pod "395fd68e-6845-4875-b0a0-de3c408b5fca" (UID: "395fd68e-6845-4875-b0a0-de3c408b5fca"). InnerVolumeSpecName "kube-api-access-sgj9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:00:07 crc kubenswrapper[4752]: I0227 18:00:07.447667 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgj9t\" (UniqueName: \"kubernetes.io/projected/395fd68e-6845-4875-b0a0-de3c408b5fca-kube-api-access-sgj9t\") on node \"crc\" DevicePath \"\"" Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.034156 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536920-tkl55" event={"ID":"395fd68e-6845-4875-b0a0-de3c408b5fca","Type":"ContainerDied","Data":"ddc8c238a631a3b21644c9e99235f878f3bfc77759e3d11ab12452b82597cf7d"} Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.034421 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc8c238a631a3b21644c9e99235f878f3bfc77759e3d11ab12452b82597cf7d" Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.034206 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536920-tkl55" Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.383869 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536914-6jdgn"] Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.388342 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536914-6jdgn"] Feb 27 18:00:08 crc kubenswrapper[4752]: I0227 18:00:08.919258 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b83e90-2de3-4492-a7fe-1593d2c90af0" path="/var/lib/kubelet/pods/60b83e90-2de3-4492-a7fe-1593d2c90af0/volumes" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.438686 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:00:50 crc kubenswrapper[4752]: E0227 18:00:50.439566 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395fd68e-6845-4875-b0a0-de3c408b5fca" containerName="oc" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.439583 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="395fd68e-6845-4875-b0a0-de3c408b5fca" containerName="oc" Feb 27 18:00:50 crc kubenswrapper[4752]: E0227 18:00:50.439599 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2793737-701c-4d12-a9c1-d43d39dce4ea" containerName="collect-profiles" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.439608 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2793737-701c-4d12-a9c1-d43d39dce4ea" containerName="collect-profiles" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.439786 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="395fd68e-6845-4875-b0a0-de3c408b5fca" containerName="oc" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.439807 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2793737-701c-4d12-a9c1-d43d39dce4ea" containerName="collect-profiles" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.441105 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.467141 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.538372 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkkx\" (UniqueName: \"kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.538420 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.538481 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.639221 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.639290 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.639388 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkkx\" (UniqueName: \"kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.639966 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.640014 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.667534 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lzkkx\" (UniqueName: \"kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx\") pod \"redhat-operators-w65lq\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:50 crc kubenswrapper[4752]: I0227 18:00:50.769232 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:00:51 crc kubenswrapper[4752]: I0227 18:00:51.000113 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:00:51 crc kubenswrapper[4752]: I0227 18:00:51.359810 4752 generic.go:334] "Generic (PLEG): container finished" podID="55510274-59d6-4e9b-9917-aba8615e66a1" containerID="17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f" exitCode=0 Feb 27 18:00:51 crc kubenswrapper[4752]: I0227 18:00:51.360230 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerDied","Data":"17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f"} Feb 27 18:00:51 crc kubenswrapper[4752]: I0227 18:00:51.360427 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerStarted","Data":"40145410ac6dda3caa0c92e11ceac753ce9c1b00b7cb73b5e48527eb13e18e45"} Feb 27 18:00:53 crc kubenswrapper[4752]: E0227 18:00:53.327853 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 18:00:53 crc kubenswrapper[4752]: E0227 18:00:53.328517 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzkkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w65lq_openshift-marketplace(55510274-59d6-4e9b-9917-aba8615e66a1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:00:53 crc kubenswrapper[4752]: E0227 18:00:53.329840 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:00:53 crc kubenswrapper[4752]: E0227 18:00:53.377818 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:03 crc kubenswrapper[4752]: I0227 18:01:03.556679 4752 scope.go:117] "RemoveContainer" containerID="735aa090e8a47d3db26ba5d3f5cbd42dcbc66508994cd9cb5bd0284908ef198d" Feb 27 18:01:06 crc kubenswrapper[4752]: I0227 18:01:06.323813 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 18:01:06 crc kubenswrapper[4752]: I0227 18:01:06.324410 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 18:01:08 crc kubenswrapper[4752]: E0227 18:01:08.191290 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 18:01:08 crc kubenswrapper[4752]: E0227 18:01:08.191976 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzkkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w65lq_openshift-marketplace(55510274-59d6-4e9b-9917-aba8615e66a1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:01:08 crc kubenswrapper[4752]: E0227 18:01:08.193219 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:12 crc kubenswrapper[4752]: E0227 18:01:12.181673 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f" Feb 27 18:01:12 crc kubenswrapper[4752]: E0227 18:01:12.183279 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7bxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_openshift-marketplace(b3272ad9-e002-48b6-92b4-27da6186f45a): ErrImagePull: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:01:12 crc kubenswrapper[4752]: E0227 18:01:12.184499 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-metallb-operator-bundle@sha256=2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f/signature-11: status 500 (Internal Server Error)\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:01:18 crc kubenswrapper[4752]: E0227 18:01:18.910164 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:23 crc kubenswrapper[4752]: E0227 18:01:23.910286 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:01:33 crc kubenswrapper[4752]: E0227 18:01:33.651788 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 27 18:01:33 crc kubenswrapper[4752]: E0227 18:01:33.652206 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzkkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-w65lq_openshift-marketplace(55510274-59d6-4e9b-9917-aba8615e66a1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:01:33 crc kubenswrapper[4752]: E0227 18:01:33.653360 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-operator-index@sha256=340dbaa786c584e5ffe05a0f79571b9c2fe7d16a1a1fb390e5d83b437d7a1ff3/signature-3: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:34 crc kubenswrapper[4752]: E0227 
18:01:34.910461 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:01:36 crc kubenswrapper[4752]: I0227 18:01:36.323616 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 18:01:36 crc kubenswrapper[4752]: I0227 18:01:36.323711 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 18:01:44 crc kubenswrapper[4752]: E0227 18:01:44.909423 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:45 crc kubenswrapper[4752]: E0227 18:01:45.909635 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:01:58 crc kubenswrapper[4752]: E0227 18:01:58.909452 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:01:59 crc kubenswrapper[4752]: E0227 18:01:59.909045 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.176911 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536922-wtljg"] Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.178385 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.181068 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.181192 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.184511 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.190337 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536922-wtljg"] Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.224830 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp76b\" (UniqueName: \"kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b\") pod \"auto-csr-approver-29536922-wtljg\" (UID: \"1fabbabe-cc23-4da0-9b3e-33784a522df8\") " pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.326251 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp76b\" (UniqueName: \"kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b\") pod \"auto-csr-approver-29536922-wtljg\" (UID: \"1fabbabe-cc23-4da0-9b3e-33784a522df8\") " pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.362244 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp76b\" (UniqueName: \"kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b\") pod \"auto-csr-approver-29536922-wtljg\" (UID: \"1fabbabe-cc23-4da0-9b3e-33784a522df8\") " pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.514234 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.757122 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536922-wtljg"] Feb 27 18:02:00 crc kubenswrapper[4752]: I0227 18:02:00.874186 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536922-wtljg" event={"ID":"1fabbabe-cc23-4da0-9b3e-33784a522df8","Type":"ContainerStarted","Data":"cc140d41c5c17f01f5e547ea5841f9bcf01e43e9bb82ee57ad9c03f1584e3da2"} Feb 27 18:02:01 crc kubenswrapper[4752]: E0227 18:02:01.726893 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:02:01 crc kubenswrapper[4752]: E0227 18:02:01.727494 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:02:01 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:02:01 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp76b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536922-wtljg_openshift-infra(1fabbabe-cc23-4da0-9b3e-33784a522df8): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:02:01 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:02:01 crc kubenswrapper[4752]: E0227 18:02:01.728789 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536922-wtljg" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" Feb 27 18:02:01 crc kubenswrapper[4752]: E0227 18:02:01.884905 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29536922-wtljg" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.325994 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.326803 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.326871 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.327868 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.327987 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" gracePeriod=600 Feb 27 18:02:06 crc kubenswrapper[4752]: E0227 18:02:06.463667 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.924065 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" exitCode=0 Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.924128 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"} Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.924209 4752 scope.go:117] "RemoveContainer" containerID="89c7c62ab067a83d8206093b66c0f0d30eb19eaa1dee526f56d7b8a3fdbbed43" Feb 27 18:02:06 crc kubenswrapper[4752]: I0227 18:02:06.925074 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:02:06 crc kubenswrapper[4752]: E0227 18:02:06.925630 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:02:11 crc kubenswrapper[4752]: E0227 18:02:11.910721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" Feb 27 18:02:12 crc kubenswrapper[4752]: E0227 18:02:12.909217 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:02:17 crc kubenswrapper[4752]: I0227 18:02:17.907194 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:02:17 crc kubenswrapper[4752]: E0227 18:02:17.908426 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:02:26 crc kubenswrapper[4752]: I0227 18:02:26.058425 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerStarted","Data":"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79"} Feb 27 18:02:27 crc kubenswrapper[4752]: I0227 18:02:27.068875 4752 generic.go:334] "Generic (PLEG): container finished" podID="55510274-59d6-4e9b-9917-aba8615e66a1" containerID="af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79" exitCode=0 Feb 27 18:02:27 crc kubenswrapper[4752]: I0227 18:02:27.069012 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerDied","Data":"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79"} Feb 27 18:02:27 crc kubenswrapper[4752]: E0227 18:02:27.908665 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:02:28 crc kubenswrapper[4752]: I0227 18:02:28.076516 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerStarted","Data":"28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709"} Feb 27 18:02:28 crc 
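The machine-config-daemon container stays in the same 5m0s CrashLoopBackOff for the rest of this window. The waiting reason the kubelet keeps printing can also be read back from the pod status itself; a sketch using standard oc/kubectl JSONPath (pod and container names as logged above):

    oc -n openshift-machine-config-operator get pod machine-config-daemon-cm8wb \
      -o jsonpath='{.status.containerStatuses[?(@.name=="machine-config-daemon")].state.waiting.reason}'

While the back-off timer runs this prints CrashLoopBackOff; the kubelet only retries RemoveContainer/StartContainer once the timer expires, which is why the same pair of lines recurs below.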
Feb 27 18:02:30 crc kubenswrapper[4752]: I0227 18:02:30.769928 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w65lq"
Feb 27 18:02:30 crc kubenswrapper[4752]: I0227 18:02:30.770026 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w65lq"
Feb 27 18:02:31 crc kubenswrapper[4752]: I0227 18:02:31.816720 4752 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="registry-server" probeResult="failure" output=<
Feb 27 18:02:31 crc kubenswrapper[4752]: timeout: failed to connect service ":50051" within 1s
Feb 27 18:02:31 crc kubenswrapper[4752]: >
Feb 27 18:02:32 crc kubenswrapper[4752]: I0227 18:02:32.907111 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"
Feb 27 18:02:32 crc kubenswrapper[4752]: E0227 18:02:32.908028 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87"
Feb 27 18:02:38 crc kubenswrapper[4752]: E0227 18:02:38.910207 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.194772 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"]
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.197076 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.210533 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"]
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.223802 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.223889 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.224345 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.325436 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.325741 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.325919 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.325946 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.326617 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.351336 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8"
"MountVolume.SetUp succeeded for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") pod \"certified-operators-rhcc8\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") " pod="openshift-marketplace/certified-operators-rhcc8" Feb 27 18:02:39 crc kubenswrapper[4752]: I0227 18:02:39.524242 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhcc8" Feb 27 18:02:40 crc kubenswrapper[4752]: I0227 18:02:40.035314 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"] Feb 27 18:02:40 crc kubenswrapper[4752]: I0227 18:02:40.164180 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerStarted","Data":"a31a71ebff017d1b722e83482d8aafdbf598b41e8f27f8a61d9c52c5443bed7e"} Feb 27 18:02:40 crc kubenswrapper[4752]: I0227 18:02:40.843631 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:02:40 crc kubenswrapper[4752]: I0227 18:02:40.937008 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:02:41 crc kubenswrapper[4752]: I0227 18:02:41.177926 4752 generic.go:334] "Generic (PLEG): container finished" podID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerID="e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684" exitCode=0 Feb 27 18:02:41 crc kubenswrapper[4752]: I0227 18:02:41.178064 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerDied","Data":"e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684"} Feb 27 18:02:42 crc kubenswrapper[4752]: I0227 18:02:42.191055 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerStarted","Data":"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf"} Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.163702 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.164025 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w65lq" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="registry-server" containerID="cri-o://28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709" gracePeriod=2 Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.214320 4752 generic.go:334] "Generic (PLEG): container finished" podID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerID="5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf" exitCode=0 Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.214386 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerDied","Data":"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf"} Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.626268 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.821948 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities\") pod \"55510274-59d6-4e9b-9917-aba8615e66a1\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.822347 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzkkx\" (UniqueName: \"kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx\") pod \"55510274-59d6-4e9b-9917-aba8615e66a1\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.822508 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content\") pod \"55510274-59d6-4e9b-9917-aba8615e66a1\" (UID: \"55510274-59d6-4e9b-9917-aba8615e66a1\") " Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.822937 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities" (OuterVolumeSpecName: "utilities") pod "55510274-59d6-4e9b-9917-aba8615e66a1" (UID: "55510274-59d6-4e9b-9917-aba8615e66a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.828969 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx" (OuterVolumeSpecName: "kube-api-access-lzkkx") pod "55510274-59d6-4e9b-9917-aba8615e66a1" (UID: "55510274-59d6-4e9b-9917-aba8615e66a1"). InnerVolumeSpecName "kube-api-access-lzkkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.923687 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.923726 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzkkx\" (UniqueName: \"kubernetes.io/projected/55510274-59d6-4e9b-9917-aba8615e66a1-kube-api-access-lzkkx\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:43 crc kubenswrapper[4752]: I0227 18:02:43.965159 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55510274-59d6-4e9b-9917-aba8615e66a1" (UID: "55510274-59d6-4e9b-9917-aba8615e66a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.024883 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55510274-59d6-4e9b-9917-aba8615e66a1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.206961 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.207075 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:02:44 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:02:44 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zp76b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536922-wtljg_openshift-infra(1fabbabe-cc23-4da0-9b3e-33784a522df8): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:02:44 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.208261 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536922-wtljg" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.222787 4752 generic.go:334] "Generic (PLEG): container finished" podID="55510274-59d6-4e9b-9917-aba8615e66a1" containerID="28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709" exitCode=0 Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.222857 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w65lq" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.222860 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerDied","Data":"28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709"} Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.223055 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w65lq" event={"ID":"55510274-59d6-4e9b-9917-aba8615e66a1","Type":"ContainerDied","Data":"40145410ac6dda3caa0c92e11ceac753ce9c1b00b7cb73b5e48527eb13e18e45"} Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.223090 4752 scope.go:117] "RemoveContainer" containerID="28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.225799 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerStarted","Data":"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205"} Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.246699 4752 scope.go:117] "RemoveContainer" containerID="af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.276485 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rhcc8" podStartSLOduration=2.725715148 podStartE2EDuration="5.276456707s" podCreationTimestamp="2026-02-27 18:02:39 +0000 UTC" firstStartedPulling="2026-02-27 18:02:41.184344179 +0000 UTC m=+1661.091161070" lastFinishedPulling="2026-02-27 18:02:43.735085738 +0000 UTC m=+1663.641902629" observedRunningTime="2026-02-27 18:02:44.257780057 +0000 UTC m=+1664.164596948" watchObservedRunningTime="2026-02-27 18:02:44.276456707 +0000 UTC m=+1664.183273598" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.283464 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.291870 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w65lq"] Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.301303 4752 scope.go:117] "RemoveContainer" containerID="17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.315741 4752 scope.go:117] "RemoveContainer" containerID="28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709" Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.316304 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709\": container with ID starting with 28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709 not found: ID does not exist" containerID="28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709" Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.316370 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709"} err="failed to get container status \"28e0eb007271d40c0cb5964aedd96047533551354bac5136b1fb8d8f72d7f709\": rpc error: code = 
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.316411 4752 scope.go:117] "RemoveContainer" containerID="af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79"
Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.316726 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79\": container with ID starting with af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79 not found: ID does not exist" containerID="af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79"
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.316762 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79"} err="failed to get container status \"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79\": rpc error: code = NotFound desc = could not find container \"af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79\": container with ID starting with af8868010e882bcfb4faffa7be662999ebc51f0e8830b903126ad42a69fcba79 not found: ID does not exist"
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.316787 4752 scope.go:117] "RemoveContainer" containerID="17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f"
Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.317089 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f\": container with ID starting with 17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f not found: ID does not exist" containerID="17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f"
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.317134 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f"} err="failed to get container status \"17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f\": rpc error: code = NotFound desc = could not find container \"17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f\": container with ID starting with 17ba22dfcfa911d57ac49a4667e567ff02596899497cd786bef45f69cf5df41f not found: ID does not exist"
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.907280 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"
Feb 27 18:02:44 crc kubenswrapper[4752]: E0227 18:02:44.907937 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87"
Feb 27 18:02:44 crc kubenswrapper[4752]: I0227 18:02:44.916179 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" path="/var/lib/kubelet/pods/55510274-59d6-4e9b-9917-aba8615e66a1/volumes"
Feb 27 18:02:49 crc kubenswrapper[4752]: I0227 18:02:49.524716 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:49 crc kubenswrapper[4752]: I0227 18:02:49.525186 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:49 crc kubenswrapper[4752]: I0227 18:02:49.574278 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:50 crc kubenswrapper[4752]: I0227 18:02:50.320048 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:50 crc kubenswrapper[4752]: I0227 18:02:50.381307 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"]
Feb 27 18:02:50 crc kubenswrapper[4752]: E0227 18:02:50.911883 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a"
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.285769 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rhcc8" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="registry-server" containerID="cri-o://aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205" gracePeriod=2
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.707187 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhcc8"
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.840719 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content\") pod \"6911b7cf-13b0-440f-bf00-c68b05620a51\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") "
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.840794 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities\") pod \"6911b7cf-13b0-440f-bf00-c68b05620a51\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") "
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.840828 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") pod \"6911b7cf-13b0-440f-bf00-c68b05620a51\" (UID: \"6911b7cf-13b0-440f-bf00-c68b05620a51\") "
Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.842365 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities" (OuterVolumeSpecName: "utilities") pod "6911b7cf-13b0-440f-bf00-c68b05620a51" (UID: "6911b7cf-13b0-440f-bf00-c68b05620a51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.849725 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c" (OuterVolumeSpecName: "kube-api-access-pk92c") pod "6911b7cf-13b0-440f-bf00-c68b05620a51" (UID: "6911b7cf-13b0-440f-bf00-c68b05620a51"). InnerVolumeSpecName "kube-api-access-pk92c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.942976 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:52 crc kubenswrapper[4752]: I0227 18:02:52.943068 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk92c\" (UniqueName: \"kubernetes.io/projected/6911b7cf-13b0-440f-bf00-c68b05620a51-kube-api-access-pk92c\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.255473 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6911b7cf-13b0-440f-bf00-c68b05620a51" (UID: "6911b7cf-13b0-440f-bf00-c68b05620a51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.297975 4752 generic.go:334] "Generic (PLEG): container finished" podID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerID="aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205" exitCode=0 Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.298035 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerDied","Data":"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205"} Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.298059 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rhcc8" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.298078 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhcc8" event={"ID":"6911b7cf-13b0-440f-bf00-c68b05620a51","Type":"ContainerDied","Data":"a31a71ebff017d1b722e83482d8aafdbf598b41e8f27f8a61d9c52c5443bed7e"} Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.298107 4752 scope.go:117] "RemoveContainer" containerID="aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.326202 4752 scope.go:117] "RemoveContainer" containerID="5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.349329 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"] Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.350072 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6911b7cf-13b0-440f-bf00-c68b05620a51-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.358472 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rhcc8"] Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.375289 4752 scope.go:117] "RemoveContainer" containerID="e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.404352 4752 scope.go:117] "RemoveContainer" containerID="aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.404947 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205\": container with ID starting with aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205 not found: ID does not exist" containerID="aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.405115 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205"} err="failed to get container status \"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205\": rpc error: code = NotFound desc = could not find container \"aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205\": container with ID starting with aba771879258aefacc986564cf973ca7daa1b377bf48da3ae52963cdc659e205 not found: ID does not exist" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.405167 4752 scope.go:117] "RemoveContainer" containerID="5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.405639 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf\": container with ID starting with 5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf not found: ID does not exist" containerID="5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.405687 4752 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf"} err="failed to get container status \"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf\": rpc error: code = NotFound desc = could not find container \"5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf\": container with ID starting with 5594571e1dba73ef559169b1c9b9ea03c0346ae8003fea87b0e40dc4d1a50caf not found: ID does not exist" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.405716 4752 scope.go:117] "RemoveContainer" containerID="e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.406101 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684\": container with ID starting with e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684 not found: ID does not exist" containerID="e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.406133 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684"} err="failed to get container status \"e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684\": rpc error: code = NotFound desc = could not find container \"e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684\": container with ID starting with e943e0a5a4fffaec30d52c7a5094a603b0add2b8b50242acacfb0080aa0ec684 not found: ID does not exist" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.627249 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ql8m"] Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.627685 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="extract-content" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.627828 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="extract-content" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.627881 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="extract-utilities" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.627899 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="extract-utilities" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.627922 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="extract-content" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.627937 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="extract-content" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.627959 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="extract-utilities" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.627974 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="extract-utilities" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.627993 4752 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.628008 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: E0227 18:02:53.628040 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.628055 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.628355 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="55510274-59d6-4e9b-9917-aba8615e66a1" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.628391 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" containerName="registry-server" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.630336 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.640264 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ql8m"] Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.755735 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.755820 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-utilities\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.755855 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4l5\" (UniqueName: \"kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.857664 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.858076 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:02:53 crc 
Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.858245 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4l5\" (UniqueName: \"kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m"
Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.858778 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-utilities\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m"
Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.886300 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4l5\" (UniqueName: \"kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5\") pod \"community-operators-8ql8m\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " pod="openshift-marketplace/community-operators-8ql8m"
Feb 27 18:02:53 crc kubenswrapper[4752]: I0227 18:02:53.959845 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ql8m"
Feb 27 18:02:54 crc kubenswrapper[4752]: I0227 18:02:54.215003 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ql8m"]
Feb 27 18:02:54 crc kubenswrapper[4752]: I0227 18:02:54.305777 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerStarted","Data":"1028b67cdf88d0ff8bc46b591d2ea2209fc504cfa83ce31dbc426ef6a7459041"}
Feb 27 18:02:54 crc kubenswrapper[4752]: I0227 18:02:54.920447 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6911b7cf-13b0-440f-bf00-c68b05620a51" path="/var/lib/kubelet/pods/6911b7cf-13b0-440f-bf00-c68b05620a51/volumes"
Feb 27 18:02:55 crc kubenswrapper[4752]: I0227 18:02:55.313830 4752 generic.go:334] "Generic (PLEG): container finished" podID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerID="f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882" exitCode=0
Feb 27 18:02:55 crc kubenswrapper[4752]: I0227 18:02:55.313914 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerDied","Data":"f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882"}
Feb 27 18:02:55 crc kubenswrapper[4752]: I0227 18:02:55.906701 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a"
Feb 27 18:02:55 crc kubenswrapper[4752]: E0227 18:02:55.907049 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87"
Feb 27 18:02:56 crc kubenswrapper[4752]: E0227 18:02:56.210384 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 27 18:02:56 crc kubenswrapper[4752]: E0227 18:02:56.210613 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z4l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8ql8m_openshift-marketplace(d92b437a-52a3-4c88-8ae9-a86b0e60fe5a): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 18:02:56 crc kubenswrapper[4752]: E0227 18:02:56.211939 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a"
Feb 27 18:02:56 crc kubenswrapper[4752]: E0227 18:02:56.324725 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a"
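The same registry-side 500 (this time for community-operator-index's signature-2) now blocks the extract-content init container of community-operators-8ql8m. Rather than grepping this kubelet log, the pull failures can be read back as API events with a field selector; this is standard oc/kubectl syntax, with names taken from the entries above:

    oc -n openshift-marketplace get events \
      --field-selector involvedObject.name=community-operators-8ql8m \
      --sort-by=.lastTimestamp

Expect Failed and BackOff events mirroring the ErrImagePull/ImagePullBackOff pairs logged here.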
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:02:58 crc kubenswrapper[4752]: E0227 18:02:58.911386 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536922-wtljg" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" Feb 27 18:03:05 crc kubenswrapper[4752]: E0227 18:03:05.908496 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:03:06 crc kubenswrapper[4752]: I0227 18:03:06.906538 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:03:06 crc kubenswrapper[4752]: E0227 18:03:06.906770 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:03:11 crc kubenswrapper[4752]: E0227 18:03:11.567463 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 18:03:11 crc kubenswrapper[4752]: E0227 18:03:11.567970 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z4l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8ql8m_openshift-marketplace(d92b437a-52a3-4c88-8ae9-a86b0e60fe5a): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:03:11 crc kubenswrapper[4752]: E0227 18:03:11.570179 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:03:13 crc kubenswrapper[4752]: I0227 18:03:13.451042 4752 generic.go:334] "Generic (PLEG): container finished" podID="1fabbabe-cc23-4da0-9b3e-33784a522df8" containerID="4a9964ce2ca4ffbaace3501e2c1ca8e305f6c43e9a5c7ee80b79dc1763244271" exitCode=0 Feb 27 18:03:13 crc kubenswrapper[4752]: I0227 18:03:13.451408 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536922-wtljg" event={"ID":"1fabbabe-cc23-4da0-9b3e-33784a522df8","Type":"ContainerDied","Data":"4a9964ce2ca4ffbaace3501e2c1ca8e305f6c43e9a5c7ee80b79dc1763244271"} Feb 27 18:03:14 crc kubenswrapper[4752]: I0227 18:03:14.730838 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:03:14 crc kubenswrapper[4752]: I0227 18:03:14.870961 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp76b\" (UniqueName: \"kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b\") pod \"1fabbabe-cc23-4da0-9b3e-33784a522df8\" (UID: \"1fabbabe-cc23-4da0-9b3e-33784a522df8\") " Feb 27 18:03:14 crc kubenswrapper[4752]: I0227 18:03:14.879278 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b" (OuterVolumeSpecName: "kube-api-access-zp76b") pod "1fabbabe-cc23-4da0-9b3e-33784a522df8" (UID: "1fabbabe-cc23-4da0-9b3e-33784a522df8"). InnerVolumeSpecName "kube-api-access-zp76b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:03:14 crc kubenswrapper[4752]: I0227 18:03:14.972431 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp76b\" (UniqueName: \"kubernetes.io/projected/1fabbabe-cc23-4da0-9b3e-33784a522df8-kube-api-access-zp76b\") on node \"crc\" DevicePath \"\"" Feb 27 18:03:15 crc kubenswrapper[4752]: I0227 18:03:15.468627 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536922-wtljg" event={"ID":"1fabbabe-cc23-4da0-9b3e-33784a522df8","Type":"ContainerDied","Data":"cc140d41c5c17f01f5e547ea5841f9bcf01e43e9bb82ee57ad9c03f1584e3da2"} Feb 27 18:03:15 crc kubenswrapper[4752]: I0227 18:03:15.468966 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc140d41c5c17f01f5e547ea5841f9bcf01e43e9bb82ee57ad9c03f1584e3da2" Feb 27 18:03:15 crc kubenswrapper[4752]: I0227 18:03:15.468685 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536922-wtljg" Feb 27 18:03:15 crc kubenswrapper[4752]: I0227 18:03:15.830062 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536916-h8kqw"] Feb 27 18:03:15 crc kubenswrapper[4752]: I0227 18:03:15.838759 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536916-h8kqw"] Feb 27 18:03:16 crc kubenswrapper[4752]: E0227 18:03:16.910433 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:03:16 crc kubenswrapper[4752]: I0227 18:03:16.915243 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8bcb56-ca08-4640-895c-b122c4ad2ad3" path="/var/lib/kubelet/pods/ef8bcb56-ca08-4640-895c-b122c4ad2ad3/volumes" Feb 27 18:03:20 crc kubenswrapper[4752]: I0227 18:03:20.913030 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:03:20 crc kubenswrapper[4752]: E0227 18:03:20.913855 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:03:24 crc kubenswrapper[4752]: E0227 18:03:24.909218 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:03:30 crc kubenswrapper[4752]: E0227 18:03:30.917475 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:03:35 crc kubenswrapper[4752]: I0227 18:03:35.906581 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:03:35 crc kubenswrapper[4752]: E0227 18:03:35.907191 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:03:38 crc kubenswrapper[4752]: E0227 18:03:38.490366 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc 
= copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 27 18:03:38 crc kubenswrapper[4752]: E0227 18:03:38.490607 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2z4l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8ql8m_openshift-marketplace(d92b437a-52a3-4c88-8ae9-a86b0e60fe5a): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:03:38 crc kubenswrapper[4752]: E0227 18:03:38.491910 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/community-operator-index@sha256=886ecdbcb5b8f90338063f6476072fab73c2a9a65b9f2b3835b7bd01c69794c1/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:03:44 crc kubenswrapper[4752]: E0227 18:03:44.909007 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-metallb-operator-bundle@sha256:2d751ef9609ce7a75d216ef5bee7417f143f8584d795cb8bf9f5df6f7e99c62f\\\"\"" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" Feb 27 18:03:46 crc 
kubenswrapper[4752]: I0227 18:03:46.907000 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:03:46 crc kubenswrapper[4752]: E0227 18:03:46.907237 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:03:48 crc kubenswrapper[4752]: E0227 18:03:48.911779 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.491230 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:03:55 crc kubenswrapper[4752]: E0227 18:03:55.492373 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" containerName="oc" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.492397 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" containerName="oc" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.492613 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fabbabe-cc23-4da0-9b3e-33784a522df8" containerName="oc" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.494084 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.514636 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.541014 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.541164 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nc7x\" (UniqueName: \"kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.541233 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.642111 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nc7x\" (UniqueName: \"kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.642471 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.642685 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.642942 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.643054 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.666955 4752 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6nc7x\" (UniqueName: \"kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x\") pod \"redhat-marketplace-f98lg\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:55 crc kubenswrapper[4752]: I0227 18:03:55.825886 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:03:56 crc kubenswrapper[4752]: I0227 18:03:56.286454 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:03:56 crc kubenswrapper[4752]: I0227 18:03:56.784274 4752 generic.go:334] "Generic (PLEG): container finished" podID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerID="52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e" exitCode=0 Feb 27 18:03:56 crc kubenswrapper[4752]: I0227 18:03:56.784330 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerDied","Data":"52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e"} Feb 27 18:03:56 crc kubenswrapper[4752]: I0227 18:03:56.784369 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerStarted","Data":"5a3c8dba9686566c68f83b1da28e509ffcea86be73dee7d934992e8e5b023081"} Feb 27 18:03:57 crc kubenswrapper[4752]: E0227 18:03:57.551047 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 18:03:57 crc kubenswrapper[4752]: E0227 18:03:57.551357 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nc7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f98lg_openshift-marketplace(3134fb27-d7aa-44ce-8d34-440774dd286b): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:03:57 crc kubenswrapper[4752]: E0227 18:03:57.552663 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:03:57 crc kubenswrapper[4752]: E0227 18:03:57.794120 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:03:58 crc kubenswrapper[4752]: I0227 18:03:58.907303 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:03:58 crc kubenswrapper[4752]: E0227 18:03:58.909081 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:03:59 crc kubenswrapper[4752]: I0227 18:03:59.810695 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3272ad9-e002-48b6-92b4-27da6186f45a" 
containerID="a6abdbf9bc25c3fe9f6e831a809a1502eab4e92de1309c1e867ee55e5899a8ed" exitCode=0 Feb 27 18:03:59 crc kubenswrapper[4752]: I0227 18:03:59.811095 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" event={"ID":"b3272ad9-e002-48b6-92b4-27da6186f45a","Type":"ContainerDied","Data":"a6abdbf9bc25c3fe9f6e831a809a1502eab4e92de1309c1e867ee55e5899a8ed"} Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.158377 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536924-2vf87"] Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.160044 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.163428 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536924-2vf87"] Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.163720 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.165576 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.165912 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.242297 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76h2g\" (UniqueName: \"kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g\") pod \"auto-csr-approver-29536924-2vf87\" (UID: \"cf1f77d3-d6cf-4926-b04c-31d88fffeba1\") " pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.343703 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76h2g\" (UniqueName: \"kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g\") pod \"auto-csr-approver-29536924-2vf87\" (UID: \"cf1f77d3-d6cf-4926-b04c-31d88fffeba1\") " pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.378346 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76h2g\" (UniqueName: \"kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g\") pod \"auto-csr-approver-29536924-2vf87\" (UID: \"cf1f77d3-d6cf-4926-b04c-31d88fffeba1\") " pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.619425 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.821622 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536924-2vf87"] Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.822019 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" event={"ID":"b3272ad9-e002-48b6-92b4-27da6186f45a","Type":"ContainerDied","Data":"41052858800d605e338f88bb84fa12c6c6911e76c9bc3360c87546485eef1b6c"} Feb 27 18:04:00 crc kubenswrapper[4752]: I0227 18:04:00.821743 4752 generic.go:334] "Generic (PLEG): container finished" podID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerID="41052858800d605e338f88bb84fa12c6c6911e76c9bc3360c87546485eef1b6c" exitCode=0 Feb 27 18:04:00 crc kubenswrapper[4752]: W0227 18:04:00.831633 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf1f77d3_d6cf_4926_b04c_31d88fffeba1.slice/crio-d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388 WatchSource:0}: Error finding container d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388: Status 404 returned error can't find the container with id d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388 Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.789014 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snmz2/must-gather-k86nf"] Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.791366 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.793931 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snmz2"/"kube-root-ca.crt" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.794189 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snmz2"/"openshift-service-ca.crt" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.813373 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snmz2/must-gather-k86nf"] Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.828810 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536924-2vf87" event={"ID":"cf1f77d3-d6cf-4926-b04c-31d88fffeba1","Type":"ContainerStarted","Data":"d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388"} Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.868836 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.868968 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8sk\" (UniqueName: \"kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.970645 4752 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.972844 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8sk\" (UniqueName: \"kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.974475 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:01 crc kubenswrapper[4752]: I0227 18:04:01.993800 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8sk\" (UniqueName: \"kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk\") pod \"must-gather-k86nf\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.090366 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.111496 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.175384 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7bxj\" (UniqueName: \"kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj\") pod \"b3272ad9-e002-48b6-92b4-27da6186f45a\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.175437 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle\") pod \"b3272ad9-e002-48b6-92b4-27da6186f45a\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.175481 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util\") pod \"b3272ad9-e002-48b6-92b4-27da6186f45a\" (UID: \"b3272ad9-e002-48b6-92b4-27da6186f45a\") " Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.176411 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle" (OuterVolumeSpecName: "bundle") pod "b3272ad9-e002-48b6-92b4-27da6186f45a" (UID: "b3272ad9-e002-48b6-92b4-27da6186f45a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.179845 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj" (OuterVolumeSpecName: "kube-api-access-t7bxj") pod "b3272ad9-e002-48b6-92b4-27da6186f45a" (UID: "b3272ad9-e002-48b6-92b4-27da6186f45a"). InnerVolumeSpecName "kube-api-access-t7bxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.189059 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util" (OuterVolumeSpecName: "util") pod "b3272ad9-e002-48b6-92b4-27da6186f45a" (UID: "b3272ad9-e002-48b6-92b4-27da6186f45a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.277065 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7bxj\" (UniqueName: \"kubernetes.io/projected/b3272ad9-e002-48b6-92b4-27da6186f45a-kube-api-access-t7bxj\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.277093 4752 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-bundle\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.277102 4752 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3272ad9-e002-48b6-92b4-27da6186f45a-util\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:02 crc kubenswrapper[4752]: W0227 18:04:02.314600 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98d4bc78_9907_4c63_8aa5_77cbe7c87733.slice/crio-7d376da7df73e84a4aa401bcf05dc388f651ead96889cb8ee92b650d04a25ed9 WatchSource:0}: Error finding container 7d376da7df73e84a4aa401bcf05dc388f651ead96889cb8ee92b650d04a25ed9: Status 404 returned error can't find the container with id 7d376da7df73e84a4aa401bcf05dc388f651ead96889cb8ee92b650d04a25ed9 Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.323385 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snmz2/must-gather-k86nf"] Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.839325 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" event={"ID":"b3272ad9-e002-48b6-92b4-27da6186f45a","Type":"ContainerDied","Data":"12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8"} Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.839688 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b32338c41b39322af859247419a6801e96df789d8c6be891d976d04d0cd6a8" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.839379 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk" Feb 27 18:04:02 crc kubenswrapper[4752]: I0227 18:04:02.840651 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snmz2/must-gather-k86nf" event={"ID":"98d4bc78-9907-4c63-8aa5-77cbe7c87733","Type":"ContainerStarted","Data":"7d376da7df73e84a4aa401bcf05dc388f651ead96889cb8ee92b650d04a25ed9"} Feb 27 18:04:02 crc kubenswrapper[4752]: E0227 18:04:02.909473 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:04:03 crc kubenswrapper[4752]: I0227 18:04:03.715675 4752 scope.go:117] "RemoveContainer" containerID="d3306d8c3a041cdf001cb44eef0e59d3d5f852ca2fcfc3fdc7d8b88bcdcf6883" Feb 27 18:04:09 crc kubenswrapper[4752]: I0227 18:04:09.899354 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snmz2/must-gather-k86nf" event={"ID":"98d4bc78-9907-4c63-8aa5-77cbe7c87733","Type":"ContainerStarted","Data":"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad"} Feb 27 18:04:10 crc kubenswrapper[4752]: I0227 18:04:10.910327 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:04:10 crc kubenswrapper[4752]: E0227 18:04:10.910998 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:04:10 crc kubenswrapper[4752]: I0227 18:04:10.917885 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snmz2/must-gather-k86nf" event={"ID":"98d4bc78-9907-4c63-8aa5-77cbe7c87733","Type":"ContainerStarted","Data":"ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45"} Feb 27 18:04:10 crc kubenswrapper[4752]: I0227 18:04:10.980643 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-snmz2/must-gather-k86nf" podStartSLOduration=2.808459818 podStartE2EDuration="9.980612788s" podCreationTimestamp="2026-02-27 18:04:01 +0000 UTC" firstStartedPulling="2026-02-27 18:04:02.317002886 +0000 UTC m=+1742.223819727" lastFinishedPulling="2026-02-27 18:04:09.489155836 +0000 UTC m=+1749.395972697" observedRunningTime="2026-02-27 18:04:10.976866526 +0000 UTC m=+1750.883683377" watchObservedRunningTime="2026-02-27 18:04:10.980612788 +0000 UTC m=+1750.887429649" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.712855 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds"] Feb 27 18:04:11 crc kubenswrapper[4752]: E0227 18:04:11.713079 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="extract" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.713091 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="extract" Feb 27 18:04:11 crc 
kubenswrapper[4752]: E0227 18:04:11.713102 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="util" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.713108 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="util" Feb 27 18:04:11 crc kubenswrapper[4752]: E0227 18:04:11.713122 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="pull" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.713129 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="pull" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.713230 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3272ad9-e002-48b6-92b4-27da6186f45a" containerName="extract" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.713586 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.718230 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.718294 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.718340 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-smrmr" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.718384 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.718868 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.732722 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds"] Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.809609 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-webhook-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.809656 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-apiservice-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.809685 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxltq\" (UniqueName: \"kubernetes.io/projected/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-kube-api-access-hxltq\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") 
" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.910718 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxltq\" (UniqueName: \"kubernetes.io/projected/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-kube-api-access-hxltq\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.910835 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-webhook-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.910856 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-apiservice-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.919810 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-webhook-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.925623 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-apiservice-cert\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: I0227 18:04:11.933442 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxltq\" (UniqueName: \"kubernetes.io/projected/5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0-kube-api-access-hxltq\") pod \"metallb-operator-controller-manager-55b4885b68-sg4ds\" (UID: \"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0\") " pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:11 crc kubenswrapper[4752]: E0227 18:04:11.941826 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 18:04:11 crc kubenswrapper[4752]: E0227 18:04:11.941973 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nc7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f98lg_openshift-marketplace(3134fb27-d7aa-44ce-8d34-440774dd286b): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:11 crc kubenswrapper[4752]: E0227 18:04:11.943999 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.030599 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.158640 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn"] Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.159309 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.163693 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-f76qz" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.163910 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.164046 4752 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.191450 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn"] Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.213990 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-apiservice-cert\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.214056 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqbs\" (UniqueName: \"kubernetes.io/projected/d07bbb76-6065-4740-a3d5-45d511110e25-kube-api-access-2gqbs\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.214079 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-webhook-cert\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.311442 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds"] Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.315065 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqbs\" (UniqueName: \"kubernetes.io/projected/d07bbb76-6065-4740-a3d5-45d511110e25-kube-api-access-2gqbs\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.315100 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-webhook-cert\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.315191 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.320981 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-apiservice-cert\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.324801 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d07bbb76-6065-4740-a3d5-45d511110e25-webhook-cert\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.332422 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqbs\" (UniqueName: \"kubernetes.io/projected/d07bbb76-6065-4740-a3d5-45d511110e25-kube-api-access-2gqbs\") pod \"metallb-operator-webhook-server-7f65d79b55-6j6sn\" (UID: \"d07bbb76-6065-4740-a3d5-45d511110e25\") " pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.525739 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.814222 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn"] Feb 27 18:04:12 crc kubenswrapper[4752]: W0227 18:04:12.820666 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd07bbb76_6065_4740_a3d5_45d511110e25.slice/crio-b6ffbd62c8d0eff0ee8ffa72e033538308baeee3c3bd81b9d1d05d6ee0390a24 WatchSource:0}: Error finding container b6ffbd62c8d0eff0ee8ffa72e033538308baeee3c3bd81b9d1d05d6ee0390a24: Status 404 returned error can't find the container with id b6ffbd62c8d0eff0ee8ffa72e033538308baeee3c3bd81b9d1d05d6ee0390a24 Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.927724 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" event={"ID":"d07bbb76-6065-4740-a3d5-45d511110e25","Type":"ContainerStarted","Data":"b6ffbd62c8d0eff0ee8ffa72e033538308baeee3c3bd81b9d1d05d6ee0390a24"} Feb 27 18:04:12 crc kubenswrapper[4752]: I0227 18:04:12.928926 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" event={"ID":"5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0","Type":"ContainerStarted","Data":"a78753a139c67528c3002ce44ddd22bb413d4ce27cb20ce97454ffe73f50a765"} Feb 27 18:04:14 crc kubenswrapper[4752]: E0227 18:04:14.912498 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" Feb 27 18:04:15 crc kubenswrapper[4752]: E0227 18:04:15.000101 4752 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225" Feb 27 18:04:15 crc kubenswrapper[4752]: E0227 18:04:15.000368 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election --disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:15 crc kubenswrapper[4752]: E0227 18:04:15.001618 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:04:15 crc kubenswrapper[4752]: E0227 18:04:15.947667 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:04:20 crc kubenswrapper[4752]: E0227 18:04:20.650618 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:04:20 crc kubenswrapper[4752]: E0227 18:04:20.651098 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info 
--webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:20 crc kubenswrapper[4752]: E0227 18:04:20.652324 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:04:20 crc kubenswrapper[4752]: E0227 18:04:20.974589 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:04:21 crc kubenswrapper[4752]: E0227 18:04:21.087168 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:04:21 crc kubenswrapper[4752]: E0227 18:04:21.087295 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:04:21 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:04:21 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-76h2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536924-2vf87_openshift-infra(cf1f77d3-d6cf-4926-b04c-31d88fffeba1): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:04:21 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:04:21 crc kubenswrapper[4752]: E0227 18:04:21.088463 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536924-2vf87" podUID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" Feb 27 18:04:21 crc kubenswrapper[4752]: E0227 18:04:21.980190 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536924-2vf87" podUID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" Feb 27 18:04:22 crc kubenswrapper[4752]: I0227 18:04:22.909295 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:04:22 crc kubenswrapper[4752]: E0227 18:04:22.909905 4752 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:04:25 crc kubenswrapper[4752]: E0227 18:04:25.909743 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:04:29 crc kubenswrapper[4752]: I0227 18:04:29.019314 4752 generic.go:334] "Generic (PLEG): container finished" podID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerID="14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f" exitCode=0 Feb 27 18:04:29 crc kubenswrapper[4752]: I0227 18:04:29.019412 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerDied","Data":"14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f"} Feb 27 18:04:31 crc kubenswrapper[4752]: I0227 18:04:31.047054 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerStarted","Data":"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf"} Feb 27 18:04:31 crc kubenswrapper[4752]: I0227 18:04:31.081307 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ql8m" podStartSLOduration=3.161544503 podStartE2EDuration="1m38.081290513s" podCreationTimestamp="2026-02-27 18:02:53 +0000 UTC" firstStartedPulling="2026-02-27 18:02:55.316053875 +0000 UTC m=+1675.222870716" lastFinishedPulling="2026-02-27 18:04:30.235799875 +0000 UTC m=+1770.142616726" observedRunningTime="2026-02-27 18:04:31.072743602 +0000 UTC m=+1770.979560463" watchObservedRunningTime="2026-02-27 18:04:31.081290513 +0000 UTC m=+1770.988107374" Feb 27 18:04:31 crc kubenswrapper[4752]: E0227 18:04:31.093704 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225" Feb 27 18:04:31 crc kubenswrapper[4752]: E0227 18:04:31.094039 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election 
--disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:31 crc kubenswrapper[4752]: E0227 18:04:31.109087 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:04:33 crc kubenswrapper[4752]: I0227 18:04:33.960736 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:33 crc kubenswrapper[4752]: I0227 18:04:33.962618 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:34 crc kubenswrapper[4752]: I0227 18:04:34.010387 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:34 crc kubenswrapper[4752]: I0227 18:04:34.907989 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:04:34 crc kubenswrapper[4752]: E0227 18:04:34.908347 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:04:35 crc kubenswrapper[4752]: I0227 18:04:35.113209 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:35 crc kubenswrapper[4752]: I0227 18:04:35.589578 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ql8m"] Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.080666 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ql8m" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="registry-server" containerID="cri-o://711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf" gracePeriod=2 Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.535301 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.651204 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-utilities\") pod \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.651460 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content\") pod \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.651577 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z4l5\" (UniqueName: \"kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5\") pod \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\" (UID: \"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a\") " Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.652236 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-utilities" (OuterVolumeSpecName: "utilities") pod "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" (UID: "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.659441 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5" (OuterVolumeSpecName: "kube-api-access-2z4l5") pod "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" (UID: "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a"). InnerVolumeSpecName "kube-api-access-2z4l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.715286 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" (UID: "d92b437a-52a3-4c88-8ae9-a86b0e60fe5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.752914 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z4l5\" (UniqueName: \"kubernetes.io/projected/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-kube-api-access-2z4l5\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.752955 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:37 crc kubenswrapper[4752]: I0227 18:04:37.752968 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.091097 4752 generic.go:334] "Generic (PLEG): container finished" podID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerID="711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf" exitCode=0 Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.091187 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerDied","Data":"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf"} Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.091233 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ql8m" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.091268 4752 scope.go:117] "RemoveContainer" containerID="711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.091232 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ql8m" event={"ID":"d92b437a-52a3-4c88-8ae9-a86b0e60fe5a","Type":"ContainerDied","Data":"1028b67cdf88d0ff8bc46b591d2ea2209fc504cfa83ce31dbc426ef6a7459041"} Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.124380 4752 scope.go:117] "RemoveContainer" containerID="14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.144481 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ql8m"] Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.149340 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ql8m"] Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.177525 4752 scope.go:117] "RemoveContainer" containerID="f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.196734 4752 scope.go:117] "RemoveContainer" containerID="711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf" Feb 27 18:04:38 crc kubenswrapper[4752]: E0227 18:04:38.197225 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf\": container with ID starting with 711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf not found: ID does not exist" containerID="711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.197276 
4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf"} err="failed to get container status \"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf\": rpc error: code = NotFound desc = could not find container \"711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf\": container with ID starting with 711c7bcc184161294fef3cc8d39581ac546e907eabc29bc0262bb6ff7b9d9abf not found: ID does not exist" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.197309 4752 scope.go:117] "RemoveContainer" containerID="14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f" Feb 27 18:04:38 crc kubenswrapper[4752]: E0227 18:04:38.198051 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f\": container with ID starting with 14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f not found: ID does not exist" containerID="14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.198098 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f"} err="failed to get container status \"14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f\": rpc error: code = NotFound desc = could not find container \"14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f\": container with ID starting with 14e56f29de7613e6e6827424d67338ffdb6d2a6dd79b6e1602c8ad8e8afc684f not found: ID does not exist" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.198125 4752 scope.go:117] "RemoveContainer" containerID="f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882" Feb 27 18:04:38 crc kubenswrapper[4752]: E0227 18:04:38.199783 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882\": container with ID starting with f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882 not found: ID does not exist" containerID="f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.199831 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882"} err="failed to get container status \"f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882\": rpc error: code = NotFound desc = could not find container \"f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882\": container with ID starting with f87fc19b4ecae2ac7b39a24042241916707dbef6469557e6152b4b4a3072d882 not found: ID does not exist" Feb 27 18:04:38 crc kubenswrapper[4752]: I0227 18:04:38.914646 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" path="/var/lib/kubelet/pods/d92b437a-52a3-4c88-8ae9-a86b0e60fe5a/volumes" Feb 27 18:04:39 crc kubenswrapper[4752]: I0227 18:04:39.102063 4752 generic.go:334] "Generic (PLEG): container finished" podID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" containerID="0c6f0479f3559237eb457d476fa88ef657df2fba434d5974902bec62bee806cb" exitCode=0 Feb 27 18:04:39 crc kubenswrapper[4752]: 
I0227 18:04:39.102117 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536924-2vf87" event={"ID":"cf1f77d3-d6cf-4926-b04c-31d88fffeba1","Type":"ContainerDied","Data":"0c6f0479f3559237eb457d476fa88ef657df2fba434d5974902bec62bee806cb"} Feb 27 18:04:40 crc kubenswrapper[4752]: I0227 18:04:40.409727 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:40 crc kubenswrapper[4752]: I0227 18:04:40.592376 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76h2g\" (UniqueName: \"kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g\") pod \"cf1f77d3-d6cf-4926-b04c-31d88fffeba1\" (UID: \"cf1f77d3-d6cf-4926-b04c-31d88fffeba1\") " Feb 27 18:04:40 crc kubenswrapper[4752]: I0227 18:04:40.602833 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g" (OuterVolumeSpecName: "kube-api-access-76h2g") pod "cf1f77d3-d6cf-4926-b04c-31d88fffeba1" (UID: "cf1f77d3-d6cf-4926-b04c-31d88fffeba1"). InnerVolumeSpecName "kube-api-access-76h2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:04:40 crc kubenswrapper[4752]: I0227 18:04:40.694118 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76h2g\" (UniqueName: \"kubernetes.io/projected/cf1f77d3-d6cf-4926-b04c-31d88fffeba1-kube-api-access-76h2g\") on node \"crc\" DevicePath \"\"" Feb 27 18:04:41 crc kubenswrapper[4752]: I0227 18:04:41.117129 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536924-2vf87" event={"ID":"cf1f77d3-d6cf-4926-b04c-31d88fffeba1","Type":"ContainerDied","Data":"d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388"} Feb 27 18:04:41 crc kubenswrapper[4752]: I0227 18:04:41.117417 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536924-2vf87" Feb 27 18:04:41 crc kubenswrapper[4752]: I0227 18:04:41.117433 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82df7e0307e76584899b31b8f53bc28db6a375a999a506673cb6257a6240388" Feb 27 18:04:41 crc kubenswrapper[4752]: I0227 18:04:41.479118 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536918-ztclp"] Feb 27 18:04:41 crc kubenswrapper[4752]: I0227 18:04:41.486137 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536918-ztclp"] Feb 27 18:04:42 crc kubenswrapper[4752]: E0227 18:04:42.045645 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 27 18:04:42 crc kubenswrapper[4752]: E0227 18:04:42.045868 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nc7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f98lg_openshift-marketplace(3134fb27-d7aa-44ce-8d34-440774dd286b): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:42 crc kubenswrapper[4752]: E0227 18:04:42.047205 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/redhat/redhat-marketplace-index@sha256=e848a00af7690cfa41500b98e0e7a0b9738ce0af7b6b4fee3ea20e0838523c30/signature-2: status 500 (Internal Server Error)\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:04:42 crc kubenswrapper[4752]: E0227 18:04:42.909339 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:04:42 crc kubenswrapper[4752]: I0227 18:04:42.919316 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cc92ab-e406-46bd-8b55-6bc73db57254" path="/var/lib/kubelet/pods/f3cc92ab-e406-46bd-8b55-6bc73db57254/volumes" Feb 27 18:04:48 crc kubenswrapper[4752]: I0227 18:04:48.907287 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:04:48 crc kubenswrapper[4752]: E0227 18:04:48.908730 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:04:55 crc kubenswrapper[4752]: E0227 18:04:55.909495 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:04:56 crc kubenswrapper[4752]: E0227 18:04:56.149455 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225" Feb 27 18:04:56 crc kubenswrapper[4752]: E0227 18:04:56.149868 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election 
--disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:04:56 crc kubenswrapper[4752]: E0227 18:04:56.151219 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:04:59 crc kubenswrapper[4752]: I0227 18:04:59.794634 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5pwkv_5992a5c1-ae6c-4bae-98b4-ad86a24b2a4e/control-plane-machine-set-operator/0.log" Feb 27 18:04:59 crc kubenswrapper[4752]: I0227 18:04:59.888164 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tjktn_72a3daf3-ca59-4211-9195-1b5c70e4de7c/kube-rbac-proxy/0.log" Feb 27 18:04:59 crc kubenswrapper[4752]: I0227 18:04:59.980677 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tjktn_72a3daf3-ca59-4211-9195-1b5c70e4de7c/machine-api-operator/0.log" Feb 27 18:05:02 crc kubenswrapper[4752]: I0227 18:05:02.906537 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:05:02 crc kubenswrapper[4752]: E0227 18:05:02.906780 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:05:03 crc kubenswrapper[4752]: I0227 18:05:03.790089 4752 scope.go:117] "RemoveContainer" containerID="1726d22bc08540dbaf8bee13b3890888ef4b1c1533e6f26bd9da7a3f0034e027" Feb 27 18:05:04 crc kubenswrapper[4752]: E0227 18:05:04.702218 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:05:04 crc kubenswrapper[4752]: E0227 18:05:04.702805 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:05:04 crc kubenswrapper[4752]: E0227 18:05:04.704128 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:05:10 crc 
kubenswrapper[4752]: E0227 18:05:10.911634 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:05:10 crc kubenswrapper[4752]: E0227 18:05:10.912011 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" Feb 27 18:05:13 crc kubenswrapper[4752]: I0227 18:05:13.906547 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:05:13 crc kubenswrapper[4752]: E0227 18:05:13.907199 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:05:13 crc kubenswrapper[4752]: I0227 18:05:13.964824 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-2vhjd_834711ff-fc4f-4160-b828-c695168e91f0/cert-manager-controller/0.log" Feb 27 18:05:14 crc kubenswrapper[4752]: I0227 18:05:14.101917 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-jm9qg_f225bed8-10a2-4f7c-b1fa-cbd00a97e654/cert-manager-cainjector/0.log" Feb 27 18:05:14 crc kubenswrapper[4752]: I0227 18:05:14.135076 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-xvqws_a7ea8051-fa57-4c00-a8f8-2f4f696701d4/cert-manager-webhook/0.log" Feb 27 18:05:14 crc kubenswrapper[4752]: E0227 18:05:14.909238 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:05:24 crc kubenswrapper[4752]: E0227 18:05:24.908546 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:05:24 crc kubenswrapper[4752]: I0227 18:05:24.908716 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 18:05:26 crc kubenswrapper[4752]: I0227 18:05:26.419516 4752 generic.go:334] "Generic (PLEG): container finished" podID="3134fb27-d7aa-44ce-8d34-440774dd286b" 
containerID="5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770" exitCode=0 Feb 27 18:05:26 crc kubenswrapper[4752]: I0227 18:05:26.419721 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerDied","Data":"5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770"} Feb 27 18:05:27 crc kubenswrapper[4752]: I0227 18:05:27.426401 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerStarted","Data":"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f"} Feb 27 18:05:27 crc kubenswrapper[4752]: I0227 18:05:27.445950 4752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f98lg" podStartSLOduration=2.363168789 podStartE2EDuration="1m32.445934084s" podCreationTimestamp="2026-02-27 18:03:55 +0000 UTC" firstStartedPulling="2026-02-27 18:03:56.78717729 +0000 UTC m=+1736.693994181" lastFinishedPulling="2026-02-27 18:05:26.869942625 +0000 UTC m=+1826.776759476" observedRunningTime="2026-02-27 18:05:27.445500993 +0000 UTC m=+1827.352317844" watchObservedRunningTime="2026-02-27 18:05:27.445934084 +0000 UTC m=+1827.352750945" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.499816 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-v5s5z_2b08a312-7e68-44c9-831e-9cc02cb723c1/nmstate-console-plugin/0.log" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.682404 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7wpqr_5fc4e66a-972d-4516-96ba-fa4b56a181a0/nmstate-handler/0.log" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.745278 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-l2g98_592dcc36-2b17-4a10-b182-b693490e83c7/kube-rbac-proxy/0.log" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.756595 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-l2g98_592dcc36-2b17-4a10-b182-b693490e83c7/nmstate-metrics/0.log" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.906578 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:05:28 crc kubenswrapper[4752]: E0227 18:05:28.906812 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.935976 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-64src_696dcdcd-c551-48ea-853c-2797a874fdaa/nmstate-operator/0.log" Feb 27 18:05:28 crc kubenswrapper[4752]: I0227 18:05:28.974489 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-k6nhg_e3dda850-6a0b-454f-aeda-b11f6c5a7604/nmstate-webhook/0.log" Feb 27 18:05:29 crc kubenswrapper[4752]: E0227 18:05:29.392049 4752 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:05:29 crc kubenswrapper[4752]: E0227 18:05:29.392365 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 
18:05:29 crc kubenswrapper[4752]: E0227 18:05:29.393709 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:05:35 crc kubenswrapper[4752]: I0227 18:05:35.826828 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:35 crc kubenswrapper[4752]: I0227 18:05:35.827512 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:35 crc kubenswrapper[4752]: I0227 18:05:35.891825 4752 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:35 crc kubenswrapper[4752]: E0227 18:05:35.914668 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:05:36 crc kubenswrapper[4752]: I0227 18:05:36.521983 4752 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:36 crc kubenswrapper[4752]: I0227 18:05:36.561906 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:05:38 crc kubenswrapper[4752]: I0227 18:05:38.499526 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f98lg" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="registry-server" containerID="cri-o://0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f" gracePeriod=2 Feb 27 18:05:38 crc kubenswrapper[4752]: I0227 18:05:38.970047 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.119659 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities\") pod \"3134fb27-d7aa-44ce-8d34-440774dd286b\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.119814 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content\") pod \"3134fb27-d7aa-44ce-8d34-440774dd286b\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.119881 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nc7x\" (UniqueName: \"kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x\") pod \"3134fb27-d7aa-44ce-8d34-440774dd286b\" (UID: \"3134fb27-d7aa-44ce-8d34-440774dd286b\") " Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.122715 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities" (OuterVolumeSpecName: "utilities") pod "3134fb27-d7aa-44ce-8d34-440774dd286b" (UID: "3134fb27-d7aa-44ce-8d34-440774dd286b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.126110 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x" (OuterVolumeSpecName: "kube-api-access-6nc7x") pod "3134fb27-d7aa-44ce-8d34-440774dd286b" (UID: "3134fb27-d7aa-44ce-8d34-440774dd286b"). InnerVolumeSpecName "kube-api-access-6nc7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.154313 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3134fb27-d7aa-44ce-8d34-440774dd286b" (UID: "3134fb27-d7aa-44ce-8d34-440774dd286b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.220991 4752 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.221027 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nc7x\" (UniqueName: \"kubernetes.io/projected/3134fb27-d7aa-44ce-8d34-440774dd286b-kube-api-access-6nc7x\") on node \"crc\" DevicePath \"\"" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.221042 4752 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3134fb27-d7aa-44ce-8d34-440774dd286b-utilities\") on node \"crc\" DevicePath \"\"" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.510908 4752 generic.go:334] "Generic (PLEG): container finished" podID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerID="0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f" exitCode=0 Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.510979 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerDied","Data":"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f"} Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.511025 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f98lg" event={"ID":"3134fb27-d7aa-44ce-8d34-440774dd286b","Type":"ContainerDied","Data":"5a3c8dba9686566c68f83b1da28e509ffcea86be73dee7d934992e8e5b023081"} Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.511051 4752 scope.go:117] "RemoveContainer" containerID="0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.511122 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f98lg" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.541214 4752 scope.go:117] "RemoveContainer" containerID="5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.550289 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.557503 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f98lg"] Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.569103 4752 scope.go:117] "RemoveContainer" containerID="52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.594838 4752 scope.go:117] "RemoveContainer" containerID="0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f" Feb 27 18:05:39 crc kubenswrapper[4752]: E0227 18:05:39.595901 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f\": container with ID starting with 0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f not found: ID does not exist" containerID="0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.595950 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f"} err="failed to get container status \"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f\": rpc error: code = NotFound desc = could not find container \"0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f\": container with ID starting with 0f8baeda2345b4040fe12aa1dbd4b4f6080cac7a8aaac03ad248a746fef0046f not found: ID does not exist" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.595983 4752 scope.go:117] "RemoveContainer" containerID="5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770" Feb 27 18:05:39 crc kubenswrapper[4752]: E0227 18:05:39.596500 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770\": container with ID starting with 5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770 not found: ID does not exist" containerID="5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.596543 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770"} err="failed to get container status \"5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770\": rpc error: code = NotFound desc = could not find container \"5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770\": container with ID starting with 5b143f403f782af0a7d408feac7627434a784656c939e60cc65d59ff08cce770 not found: ID does not exist" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.596574 4752 scope.go:117] "RemoveContainer" containerID="52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e" Feb 27 18:05:39 crc kubenswrapper[4752]: E0227 18:05:39.596888 4752 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e\": container with ID starting with 52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e not found: ID does not exist" containerID="52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e" Feb 27 18:05:39 crc kubenswrapper[4752]: I0227 18:05:39.596915 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e"} err="failed to get container status \"52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e\": rpc error: code = NotFound desc = could not find container \"52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e\": container with ID starting with 52b60edd1ba56ecb269a9d3ed85919ba077271f8d53668dac45d96ad3499603e not found: ID does not exist" Feb 27 18:05:40 crc kubenswrapper[4752]: E0227 18:05:40.916841 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:05:40 crc kubenswrapper[4752]: I0227 18:05:40.921993 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" path="/var/lib/kubelet/pods/3134fb27-d7aa-44ce-8d34-440774dd286b/volumes" Feb 27 18:05:42 crc kubenswrapper[4752]: I0227 18:05:42.907623 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:05:42 crc kubenswrapper[4752]: E0227 18:05:42.910197 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:05:51 crc kubenswrapper[4752]: E0227 18:05:51.449627 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225" Feb 27 18:05:51 crc kubenswrapper[4752]: E0227 18:05:51.450849 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election 
--disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:05:51 crc kubenswrapper[4752]: E0227 18:05:51.452231 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:05:55 crc kubenswrapper[4752]: E0227 18:05:55.909413 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:05:57 crc kubenswrapper[4752]: I0227 18:05:57.906948 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:05:57 crc kubenswrapper[4752]: E0227 18:05:57.907343 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.133503 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536926-zvlsw"] Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.134312 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="extract-utilities" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.134395 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="extract-utilities" Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.134847 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="extract-utilities" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.134904 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="extract-utilities" Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.134960 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="registry-server" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135016 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="registry-server" Feb 27 
18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.135077 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" containerName="oc" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135132 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" containerName="oc" Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.135398 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="registry-server" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135451 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="registry-server" Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.135518 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="extract-content" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135571 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="extract-content" Feb 27 18:06:00 crc kubenswrapper[4752]: E0227 18:06:00.135630 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="extract-content" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135683 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="extract-content" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135849 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92b437a-52a3-4c88-8ae9-a86b0e60fe5a" containerName="registry-server" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135912 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="3134fb27-d7aa-44ce-8d34-440774dd286b" containerName="registry-server" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.135976 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1f77d3-d6cf-4926-b04c-31d88fffeba1" containerName="oc" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.136427 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.139275 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.142134 4752 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wtt6d" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.144971 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536926-zvlsw"] Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.145305 4752 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.236111 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49xh\" (UniqueName: \"kubernetes.io/projected/d054e9b4-5554-4da5-a57b-f834ad3fa861-kube-api-access-g49xh\") pod \"auto-csr-approver-29536926-zvlsw\" (UID: \"d054e9b4-5554-4da5-a57b-f834ad3fa861\") " pod="openshift-infra/auto-csr-approver-29536926-zvlsw" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.337197 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49xh\" (UniqueName: \"kubernetes.io/projected/d054e9b4-5554-4da5-a57b-f834ad3fa861-kube-api-access-g49xh\") pod \"auto-csr-approver-29536926-zvlsw\" (UID: \"d054e9b4-5554-4da5-a57b-f834ad3fa861\") " pod="openshift-infra/auto-csr-approver-29536926-zvlsw" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.359296 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49xh\" (UniqueName: \"kubernetes.io/projected/d054e9b4-5554-4da5-a57b-f834ad3fa861-kube-api-access-g49xh\") pod \"auto-csr-approver-29536926-zvlsw\" (UID: \"d054e9b4-5554-4da5-a57b-f834ad3fa861\") " pod="openshift-infra/auto-csr-approver-29536926-zvlsw" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.501423 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" Feb 27 18:06:00 crc kubenswrapper[4752]: I0227 18:06:00.977459 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536926-zvlsw"] Feb 27 18:06:01 crc kubenswrapper[4752]: I0227 18:06:01.665220 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" event={"ID":"d054e9b4-5554-4da5-a57b-f834ad3fa861","Type":"ContainerStarted","Data":"72c4ea2da2cc65d400c92739011238b2bf0c694784606787bdb2cdd626dad93b"} Feb 27 18:06:01 crc kubenswrapper[4752]: E0227 18:06:01.955542 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:06:01 crc kubenswrapper[4752]: E0227 18:06:01.955703 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:06:01 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:06:01 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536926-zvlsw_openshift-infra(d054e9b4-5554-4da5-a57b-f834ad3fa861): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:06:01 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:06:01 crc kubenswrapper[4752]: E0227 18:06:01.956856 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:06:02 crc kubenswrapper[4752]: E0227 18:06:02.673170 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" 
pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:06:06 crc kubenswrapper[4752]: E0227 18:06:06.911310 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:06:08 crc kubenswrapper[4752]: I0227 18:06:08.906964 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:06:08 crc kubenswrapper[4752]: E0227 18:06:08.908063 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:06:12 crc kubenswrapper[4752]: E0227 18:06:12.261489 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:06:12 crc kubenswrapper[4752]: E0227 18:06:12.261959 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 
monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:06:12 crc kubenswrapper[4752]: E0227 18:06:12.263205 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:06:13 crc kubenswrapper[4752]: I0227 18:06:13.920343 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/util/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.073337 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/util/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.110234 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/pull/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.126852 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/pull/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.354243 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/util/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.385944 4752 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/pull/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.427096 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82kfvph_cbf6b4bc-d1ce-4138-aa53-46ef4dcffbe3/extract/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.495548 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-utilities/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.656101 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-utilities/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.687858 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-content/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.694134 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-content/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.867340 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-content/0.log" Feb 27 18:06:14 crc kubenswrapper[4752]: I0227 18:06:14.868010 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/extract-utilities/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.027528 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-utilities/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.131231 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p45jr_858dd67c-16d0-4e5f-b8a4-a93fec256951/registry-server/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.233694 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-content/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.238203 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-content/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.258764 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-utilities/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.424034 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-content/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.427333 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/extract-utilities/0.log" Feb 27 18:06:15 crc 
kubenswrapper[4752]: I0227 18:06:15.624846 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/util/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.648992 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tv7td_5295b606-ce89-48c6-809f-36bc6bbfd87f/registry-server/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.778316 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/util/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.791709 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/pull/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: I0227 18:06:15.807177 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/pull/0.log" Feb 27 18:06:15 crc kubenswrapper[4752]: E0227 18:06:15.847391 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:06:15 crc kubenswrapper[4752]: E0227 18:06:15.847519 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:06:15 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:06:15 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536926-zvlsw_openshift-infra(d054e9b4-5554-4da5-a57b-f834ad3fa861): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:06:15 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:06:15 crc kubenswrapper[4752]: E0227 18:06:15.848707 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: 
\"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.006040 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/util/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.007211 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/pull/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.026966 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l57kk_b3272ad9-e002-48b6-92b4-27da6186f45a/extract/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.226554 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-utilities/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.230173 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s9kms_96220b7c-718c-4eca-b3e3-46d1143e6124/marketplace-operator/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.367918 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-utilities/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.370399 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-content/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.473785 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-content/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.602613 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-content/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.632104 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/extract-utilities/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.686180 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wpsdf_cb10b603-75bd-4c13-b326-bcd1837e25c1/registry-server/0.log" Feb 27 18:06:16 crc kubenswrapper[4752]: I0227 18:06:16.825614 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-utilities/0.log" Feb 27 18:06:17 crc kubenswrapper[4752]: I0227 18:06:17.097862 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-utilities/0.log" Feb 27 18:06:17 crc 
kubenswrapper[4752]: I0227 18:06:17.103607 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-content/0.log" Feb 27 18:06:17 crc kubenswrapper[4752]: I0227 18:06:17.115782 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-content/0.log" Feb 27 18:06:17 crc kubenswrapper[4752]: I0227 18:06:17.288176 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-content/0.log" Feb 27 18:06:17 crc kubenswrapper[4752]: I0227 18:06:17.310914 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/extract-utilities/0.log" Feb 27 18:06:17 crc kubenswrapper[4752]: I0227 18:06:17.546804 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-s684v_48eed09c-cb79-4247-82f5-05a408fda589/registry-server/0.log" Feb 27 18:06:20 crc kubenswrapper[4752]: E0227 18:06:20.912619 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:06:21 crc kubenswrapper[4752]: I0227 18:06:21.907788 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:06:21 crc kubenswrapper[4752]: E0227 18:06:21.908293 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:06:24 crc kubenswrapper[4752]: E0227 18:06:24.908814 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:06:30 crc kubenswrapper[4752]: E0227 18:06:30.914491 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:06:32 crc kubenswrapper[4752]: I0227 18:06:32.907601 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:06:32 crc kubenswrapper[4752]: E0227 18:06:32.908372 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:06:32 crc kubenswrapper[4752]: E0227 18:06:32.908513 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:06:39 crc kubenswrapper[4752]: E0227 18:06:39.909332 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:06:46 crc kubenswrapper[4752]: E0227 18:06:46.474465 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:06:46 crc kubenswrapper[4752]: E0227 18:06:46.475084 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:06:46 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:06:46 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536926-zvlsw_openshift-infra(d054e9b4-5554-4da5-a57b-f834ad3fa861): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:06:46 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:06:46 crc kubenswrapper[4752]: E0227 18:06:46.476875 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading 
signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:06:46 crc kubenswrapper[4752]: E0227 18:06:46.911897 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:06:47 crc kubenswrapper[4752]: I0227 18:06:47.906717 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:06:47 crc kubenswrapper[4752]: E0227 18:06:47.907185 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:06:51 crc kubenswrapper[4752]: E0227 18:06:51.910026 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:07:00 crc kubenswrapper[4752]: E0227 18:07:00.916528 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:07:01 crc kubenswrapper[4752]: I0227 18:07:01.907356 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:07:01 crc kubenswrapper[4752]: E0227 18:07:01.907821 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cm8wb_openshift-machine-config-operator(53ce186c-640f-4ade-94e1-587c1440fe87)\"" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" Feb 27 18:07:01 crc kubenswrapper[4752]: E0227 18:07:01.911009 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:07:06 crc kubenswrapper[4752]: E0227 18:07:06.908317 4752 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:07:15 crc kubenswrapper[4752]: E0227 18:07:15.467888 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225" Feb 27 18:07:15 crc kubenswrapper[4752]: E0227 18:07:15.469372 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election --disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:07:15 crc kubenswrapper[4752]: E0227 18:07:15.470658 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:07:15 crc kubenswrapper[4752]: I0227 18:07:15.907315 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:07:15 crc kubenswrapper[4752]: E0227 18:07:15.909975 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:07:17 crc kubenswrapper[4752]: I0227 18:07:17.191130 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"854c3bc3b9416613e89f531087b95ec91a6b133877f6a495839ae5cf6f7f45bb"} Feb 27 18:07:21 crc kubenswrapper[4752]: E0227 18:07:21.910063 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:07:26 crc kubenswrapper[4752]: I0227 18:07:26.257271 4752 generic.go:334] "Generic (PLEG): container finished" podID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerID="741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad" exitCode=0 Feb 27 18:07:26 crc kubenswrapper[4752]: I0227 18:07:26.257413 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snmz2/must-gather-k86nf" event={"ID":"98d4bc78-9907-4c63-8aa5-77cbe7c87733","Type":"ContainerDied","Data":"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad"} Feb 27 18:07:26 crc kubenswrapper[4752]: I0227 18:07:26.258096 4752 scope.go:117] "RemoveContainer" containerID="741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad" Feb 27 18:07:26 crc kubenswrapper[4752]: I0227 18:07:26.350053 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snmz2_must-gather-k86nf_98d4bc78-9907-4c63-8aa5-77cbe7c87733/gather/0.log" Feb 27 18:07:28 crc kubenswrapper[4752]: E0227 18:07:28.008034 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:07:28 crc kubenswrapper[4752]: E0227 18:07:28.008300 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:07:28 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:07:28 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536926-zvlsw_openshift-infra(d054e9b4-5554-4da5-a57b-f834ad3fa861): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:07:28 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:07:28 crc kubenswrapper[4752]: E0227 18:07:28.009903 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:07:28 crc kubenswrapper[4752]: E0227 18:07:28.908876 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.249482 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snmz2/must-gather-k86nf"] Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.250482 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-snmz2/must-gather-k86nf" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="copy" containerID="cri-o://ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45" gracePeriod=2 Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.253065 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snmz2/must-gather-k86nf"] Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.672433 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snmz2_must-gather-k86nf_98d4bc78-9907-4c63-8aa5-77cbe7c87733/copy/0.log" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.674780 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.694813 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8sk\" (UniqueName: \"kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk\") pod \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.694882 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output\") pod \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\" (UID: \"98d4bc78-9907-4c63-8aa5-77cbe7c87733\") " Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.702779 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk" (OuterVolumeSpecName: "kube-api-access-nh8sk") pod "98d4bc78-9907-4c63-8aa5-77cbe7c87733" (UID: "98d4bc78-9907-4c63-8aa5-77cbe7c87733"). InnerVolumeSpecName "kube-api-access-nh8sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.760638 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "98d4bc78-9907-4c63-8aa5-77cbe7c87733" (UID: "98d4bc78-9907-4c63-8aa5-77cbe7c87733"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.796463 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8sk\" (UniqueName: \"kubernetes.io/projected/98d4bc78-9907-4c63-8aa5-77cbe7c87733-kube-api-access-nh8sk\") on node \"crc\" DevicePath \"\"" Feb 27 18:07:33 crc kubenswrapper[4752]: I0227 18:07:33.796496 4752 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d4bc78-9907-4c63-8aa5-77cbe7c87733-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.314063 4752 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snmz2_must-gather-k86nf_98d4bc78-9907-4c63-8aa5-77cbe7c87733/copy/0.log" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.316186 4752 generic.go:334] "Generic (PLEG): container finished" podID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerID="ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45" exitCode=143 Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.316268 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snmz2/must-gather-k86nf" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.316347 4752 scope.go:117] "RemoveContainer" containerID="ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.342995 4752 scope.go:117] "RemoveContainer" containerID="741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.391226 4752 scope.go:117] "RemoveContainer" containerID="ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45" Feb 27 18:07:34 crc kubenswrapper[4752]: E0227 18:07:34.391984 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45\": container with ID starting with ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45 not found: ID does not exist" containerID="ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.392041 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45"} err="failed to get container status \"ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45\": rpc error: code = NotFound desc = could not find container \"ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45\": container with ID starting with ea3ee97c1191d332144e2f48736bb7b4efc9363035883dc74cd2c8b948148c45 not found: ID does not exist" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.392069 4752 scope.go:117] "RemoveContainer" containerID="741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad" Feb 27 18:07:34 crc 
kubenswrapper[4752]: E0227 18:07:34.392432 4752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad\": container with ID starting with 741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad not found: ID does not exist" containerID="741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.392510 4752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad"} err="failed to get container status \"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad\": rpc error: code = NotFound desc = could not find container \"741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad\": container with ID starting with 741b4f7e122905403da4108091477c3422a11336be8b0e8c09b35250cd3ba3ad not found: ID does not exist" Feb 27 18:07:34 crc kubenswrapper[4752]: I0227 18:07:34.922611 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" path="/var/lib/kubelet/pods/98d4bc78-9907-4c63-8aa5-77cbe7c87733/volumes" Feb 27 18:07:35 crc kubenswrapper[4752]: E0227 18:07:35.597421 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:07:35 crc kubenswrapper[4752]: E0227 18:07:35.598138 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 
monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:07:35 crc kubenswrapper[4752]: E0227 18:07:35.599528 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:07:39 crc kubenswrapper[4752]: E0227 18:07:39.908593 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:07:43 crc kubenswrapper[4752]: E0227 18:07:43.910923 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:07:46 crc kubenswrapper[4752]: E0227 18:07:46.910933 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:07:51 crc kubenswrapper[4752]: E0227 18:07:51.908899 4752 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:07:56 crc kubenswrapper[4752]: E0227 18:07:56.909019 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:07:58 crc kubenswrapper[4752]: E0227 18:07:58.909127 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150008 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536928-xls6k"] Feb 27 18:08:00 crc kubenswrapper[4752]: E0227 18:08:00.150269 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="copy" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150281 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="copy" Feb 27 18:08:00 crc kubenswrapper[4752]: E0227 18:08:00.150289 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="gather" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150295 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="gather" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150391 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="gather" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150410 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d4bc78-9907-4c63-8aa5-77cbe7c87733" containerName="copy" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.150831 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.157567 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536928-xls6k"] Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.264863 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjpr\" (UniqueName: \"kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr\") pod \"auto-csr-approver-29536928-xls6k\" (UID: \"7a6923f8-2095-4879-a6a6-629b073ba504\") " pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.366356 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjpr\" (UniqueName: \"kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr\") pod \"auto-csr-approver-29536928-xls6k\" (UID: \"7a6923f8-2095-4879-a6a6-629b073ba504\") " pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.403385 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjpr\" (UniqueName: \"kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr\") pod \"auto-csr-approver-29536928-xls6k\" (UID: \"7a6923f8-2095-4879-a6a6-629b073ba504\") " pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.486066 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:00 crc kubenswrapper[4752]: I0227 18:08:00.749188 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536928-xls6k"] Feb 27 18:08:00 crc kubenswrapper[4752]: W0227 18:08:00.768396 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a6923f8_2095_4879_a6a6_629b073ba504.slice/crio-46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034 WatchSource:0}: Error finding container 46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034: Status 404 returned error can't find the container with id 46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034 Feb 27 18:08:01 crc kubenswrapper[4752]: I0227 18:08:01.510837 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536928-xls6k" event={"ID":"7a6923f8-2095-4879-a6a6-629b073ba504","Type":"ContainerStarted","Data":"46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034"} Feb 27 18:08:02 crc kubenswrapper[4752]: E0227 18:08:02.288349 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:08:02 crc kubenswrapper[4752]: E0227 18:08:02.288628 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:08:02 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not 
.status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:08:02 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pzjpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536928-xls6k_openshift-infra(7a6923f8-2095-4879-a6a6-629b073ba504): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:08:02 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:08:02 crc kubenswrapper[4752]: E0227 18:08:02.290055 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536928-xls6k" podUID="7a6923f8-2095-4879-a6a6-629b073ba504" Feb 27 18:08:02 crc kubenswrapper[4752]: E0227 18:08:02.520238 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536928-xls6k" podUID="7a6923f8-2095-4879-a6a6-629b073ba504" Feb 27 18:08:03 crc kubenswrapper[4752]: E0227 18:08:03.907090 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:08:10 crc kubenswrapper[4752]: E0227 18:08:10.912848 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:08:12 crc kubenswrapper[4752]: E0227 18:08:12.909856 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 
27 18:08:15 crc kubenswrapper[4752]: E0227 18:08:15.909459 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:08:16 crc kubenswrapper[4752]: I0227 18:08:16.613507 4752 generic.go:334] "Generic (PLEG): container finished" podID="7a6923f8-2095-4879-a6a6-629b073ba504" containerID="0d21ca5ad9262dd93b2b909c856c824037a1c97c96d98cf230d34f1c444a0de2" exitCode=0 Feb 27 18:08:16 crc kubenswrapper[4752]: I0227 18:08:16.613622 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536928-xls6k" event={"ID":"7a6923f8-2095-4879-a6a6-629b073ba504","Type":"ContainerDied","Data":"0d21ca5ad9262dd93b2b909c856c824037a1c97c96d98cf230d34f1c444a0de2"} Feb 27 18:08:17 crc kubenswrapper[4752]: I0227 18:08:17.910749 4752 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.022108 4752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzjpr\" (UniqueName: \"kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr\") pod \"7a6923f8-2095-4879-a6a6-629b073ba504\" (UID: \"7a6923f8-2095-4879-a6a6-629b073ba504\") " Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.031721 4752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr" (OuterVolumeSpecName: "kube-api-access-pzjpr") pod "7a6923f8-2095-4879-a6a6-629b073ba504" (UID: "7a6923f8-2095-4879-a6a6-629b073ba504"). InnerVolumeSpecName "kube-api-access-pzjpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.123737 4752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzjpr\" (UniqueName: \"kubernetes.io/projected/7a6923f8-2095-4879-a6a6-629b073ba504-kube-api-access-pzjpr\") on node \"crc\" DevicePath \"\"" Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.633509 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536928-xls6k" event={"ID":"7a6923f8-2095-4879-a6a6-629b073ba504","Type":"ContainerDied","Data":"46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034"} Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.633563 4752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f7a667694a7e764e8204daeb1fd620788c9e17c951fd33c1f0f2b0dc7df034" Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.634039 4752 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536928-xls6k" Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.973762 4752 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29536920-tkl55"] Feb 27 18:08:18 crc kubenswrapper[4752]: I0227 18:08:18.978007 4752 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29536920-tkl55"] Feb 27 18:08:20 crc kubenswrapper[4752]: I0227 18:08:20.918998 4752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395fd68e-6845-4875-b0a0-de3c408b5fca" path="/var/lib/kubelet/pods/395fd68e-6845-4875-b0a0-de3c408b5fca/volumes" Feb 27 18:08:22 crc kubenswrapper[4752]: E0227 18:08:22.910197 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:08:25 crc kubenswrapper[4752]: E0227 18:08:25.908316 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:08:26 crc kubenswrapper[4752]: E0227 18:08:26.909508 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:08:34 crc kubenswrapper[4752]: E0227 18:08:34.909602 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:08:38 crc kubenswrapper[4752]: E0227 18:08:38.908729 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:08:40 crc kubenswrapper[4752]: E0227 18:08:40.914981 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:08:47 crc kubenswrapper[4752]: E0227 18:08:47.908304 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:08:53 crc kubenswrapper[4752]: E0227 18:08:53.908381 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:08:54 crc kubenswrapper[4752]: E0227 18:08:54.140069 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:08:54 crc kubenswrapper[4752]: E0227 18:08:54.140498 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:08:54 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:08:54 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g49xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536926-zvlsw_openshift-infra(d054e9b4-5554-4da5-a57b-f834ad3fa861): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:08:54 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:08:54 crc kubenswrapper[4752]: E0227 18:08:54.141739 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:08:58 crc kubenswrapper[4752]: E0227 18:08:58.910573 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:09:03 crc kubenswrapper[4752]: I0227 18:09:03.932489 4752 scope.go:117] "RemoveContainer" containerID="f6718d423c5b5f9c3221eab8ee92e3b4f38624e012e229e14c7df83132795756" Feb 27 18:09:06 crc kubenswrapper[4752]: E0227 18:09:06.910368 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:09:06 crc kubenswrapper[4752]: E0227 18:09:06.910407 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:09:10 crc kubenswrapper[4752]: E0227 18:09:10.915018 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:09:17 crc kubenswrapper[4752]: E0227 18:09:17.907861 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:09:21 crc kubenswrapper[4752]: E0227 18:09:21.908244 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:09:23 crc kubenswrapper[4752]: E0227 18:09:23.909081 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:09:31 crc kubenswrapper[4752]: E0227 18:09:31.911611 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" 
podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:09:33 crc kubenswrapper[4752]: E0227 18:09:33.909329 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:09:34 crc kubenswrapper[4752]: E0227 18:09:34.908449 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:09:36 crc kubenswrapper[4752]: I0227 18:09:36.323740 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 18:09:36 crc kubenswrapper[4752]: I0227 18:09:36.323834 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 18:09:42 crc kubenswrapper[4752]: E0227 18:09:42.910466 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:09:44 crc kubenswrapper[4752]: E0227 18:09:44.909269 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:09:46 crc kubenswrapper[4752]: E0227 18:09:46.909711 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:09:54 crc kubenswrapper[4752]: E0227 18:09:54.909647 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:09:57 crc kubenswrapper[4752]: E0227 18:09:57.908674 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.164884 4752 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29536930-blhls"] Feb 27 18:10:00 crc kubenswrapper[4752]: E0227 18:10:00.165558 4752 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6923f8-2095-4879-a6a6-629b073ba504" containerName="oc" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.165574 4752 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6923f8-2095-4879-a6a6-629b073ba504" containerName="oc" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.165855 4752 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6923f8-2095-4879-a6a6-629b073ba504" containerName="oc" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.166524 4752 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29536930-blhls" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.194310 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536930-blhls"] Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.289572 4752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5l9w\" (UniqueName: \"kubernetes.io/projected/24b7d3c2-d38e-4034-8402-e6597e931234-kube-api-access-n5l9w\") pod \"auto-csr-approver-29536930-blhls\" (UID: \"24b7d3c2-d38e-4034-8402-e6597e931234\") " pod="openshift-infra/auto-csr-approver-29536930-blhls" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.390708 4752 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5l9w\" (UniqueName: \"kubernetes.io/projected/24b7d3c2-d38e-4034-8402-e6597e931234-kube-api-access-n5l9w\") pod \"auto-csr-approver-29536930-blhls\" (UID: \"24b7d3c2-d38e-4034-8402-e6597e931234\") " pod="openshift-infra/auto-csr-approver-29536930-blhls" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.417722 4752 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5l9w\" (UniqueName: \"kubernetes.io/projected/24b7d3c2-d38e-4034-8402-e6597e931234-kube-api-access-n5l9w\") pod \"auto-csr-approver-29536930-blhls\" (UID: \"24b7d3c2-d38e-4034-8402-e6597e931234\") " pod="openshift-infra/auto-csr-approver-29536930-blhls" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.498171 4752 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29536930-blhls" Feb 27 18:10:00 crc kubenswrapper[4752]: I0227 18:10:00.775025 4752 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29536930-blhls"] Feb 27 18:10:00 crc kubenswrapper[4752]: W0227 18:10:00.782664 4752 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24b7d3c2_d38e_4034_8402_e6597e931234.slice/crio-525227c5f0f9c498ee62c96bd838b5358b866bb023a5b67362a57da5857ffd25 WatchSource:0}: Error finding container 525227c5f0f9c498ee62c96bd838b5358b866bb023a5b67362a57da5857ffd25: Status 404 returned error can't find the container with id 525227c5f0f9c498ee62c96bd838b5358b866bb023a5b67362a57da5857ffd25 Feb 27 18:10:01 crc kubenswrapper[4752]: I0227 18:10:01.395032 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29536930-blhls" event={"ID":"24b7d3c2-d38e-4034-8402-e6597e931234","Type":"ContainerStarted","Data":"525227c5f0f9c498ee62c96bd838b5358b866bb023a5b67362a57da5857ffd25"} Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.053102 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.053332 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:10:02 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:10:02 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n5l9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536930-blhls_openshift-infra(24b7d3c2-d38e-4034-8402-e6597e931234): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:10:02 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.054581 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.402883 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536930-blhls" podUID="24b7d3c2-d38e-4034-8402-e6597e931234"
Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.536835 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225"
Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.537038 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225,Command:[/manager],Args:[--enable-leader-election --disable-cert-rotation=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:SPEAKER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:CONTROLLER_IMAGE,Value:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,ValueFrom:nil,},EnvVar{Name:FRR_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:registry.redhat.io/openshift4/ose-kube-rbac-proxy-rhel9@sha256:787be45b5241419b6819676d43325a9030c0e16441918e4a33a44f0380d6b902,ValueFrom:nil,},EnvVar{Name:DEPLOY_KUBE_RBAC_PROXIES,Value:true,ValueFrom:nil,},EnvVar{Name:FRRK8S_IMAGE,Value:registry.redhat.io/openshift4/frr-rhel9@sha256:d7e76e936159ed04e779a66d421cc3ecc6c82409e8eed924112d9174c3d6aad9,ValueFrom:nil,},EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:DEPLOY_PODMONITORS,Value:false,ValueFrom:nil,},EnvVar{Name:DEPLOY_SERVICEMONITORS,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:ENABLE_OPERATOR_WEBHOOK,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_PORT,Value:29150,ValueFrom:nil,},EnvVar{Name:HTTPS_METRICS_PORT,Value:9120,ValueFrom:nil,},EnvVar{Name:FRR_METRICS_PORT,Value:29151,ValueFrom:nil,},EnvVar{Name:FRR_HTTPS_METRICS_PORT,Value:9121,ValueFrom:nil,},EnvVar{Name:MEMBER_LIST_BIND_PORT,Value:9122,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxltq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-controller-manager-55b4885b68-sg4ds_metallb-system(5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)" logger="UnhandledError"
Feb 27 18:10:02 crc kubenswrapper[4752]: E0227 18:10:02.538225 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9-operator@sha256=5362b055219db524ad33db3122acfcf8b562f9fc2386ca5d1176cb2e4a29ec82/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0"
Feb 27 18:10:06 crc kubenswrapper[4752]: I0227 18:10:06.324034 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 27 18:10:06 crc kubenswrapper[4752]: I0227 18:10:06.324436 4752 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 27 18:10:07 crc kubenswrapper[4752]: E0227 18:10:07.909842 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25"
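
NOTE: Every ErrImagePull in this log fails at the same step: after the manifest is fetched, the image backend tries to read "simple signing" signatures from the registry's sigstore staging area and gets an HTTP 500 back. The lookup URL pattern (<sigstore-base>/<repo>@sha256=<manifest-digest>/signature-<n>) comes from the host's signature-storage configuration; on RHEL-family hosts this typically lives under /etc/containers/registries.d and points registry.redhat.io at the sigstore base seen in these errors. A way to inspect it on the node (the exact file name and contents are an assumption about this host, not read from the log):

    cat /etc/containers/registries.d/registry.redhat.io.yaml
    # expected shape (illustrative):
    # docker:
    #   registry.redhat.io:
    #     sigstore: https://registry.redhat.io/containers/sigstore

The sigstore staging area serves plain HTTPS blobs and can typically be fetched without pull credentials, so the server-side failure can be confirmed directly with a HEAD request against one of the URLs taken verbatim from the log:

    curl -sI https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7

A 500 here reproduces the registry-side outage; refusing to run an image whose signatures cannot be read is the configured, expected behavior on the node.
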
\\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:10:12 crc kubenswrapper[4752]: E0227 18:10:12.908640 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:10:14 crc kubenswrapper[4752]: E0227 18:10:14.908677 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:10:17 crc kubenswrapper[4752]: E0227 18:10:17.015554 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 27 18:10:17 crc kubenswrapper[4752]: E0227 18:10:17.015742 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 27 18:10:17 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 27 18:10:17 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n5l9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536930-blhls_openshift-infra(24b7d3c2-d38e-4034-8402-e6597e931234): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error) Feb 27 18:10:17 crc kubenswrapper[4752]: > logger="UnhandledError" Feb 27 18:10:17 crc kubenswrapper[4752]: E0227 18:10:17.016990 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from 
https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536930-blhls" podUID="24b7d3c2-d38e-4034-8402-e6597e931234" Feb 27 18:10:20 crc kubenswrapper[4752]: E0227 18:10:20.174886 4752 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9" Feb 27 18:10:20 crc kubenswrapper[4752]: E0227 18:10:20.175629 4752 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202602140741,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gqbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
metallb-operator-webhook-server-7f65d79b55-6j6sn_metallb-system(d07bbb76-6065-4740-a3d5-45d511110e25): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)" logger="UnhandledError" Feb 27 18:10:20 crc kubenswrapper[4752]: E0227 18:10:20.176966 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/metallb-rhel9@sha256=6fd8cc1924db80d00ff2e105c8dcfcb466bc6a501b46d78c6da38a2a6f4f60a1/signature-11: status 500 (Internal Server Error)\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:10:23 crc kubenswrapper[4752]: E0227 18:10:23.912187 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:10:28 crc kubenswrapper[4752]: E0227 18:10:28.909250 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536930-blhls" podUID="24b7d3c2-d38e-4034-8402-e6597e931234" Feb 27 18:10:28 crc kubenswrapper[4752]: E0227 18:10:28.909546 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:10:30 crc kubenswrapper[4752]: E0227 18:10:30.917502 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:10:35 crc kubenswrapper[4752]: E0227 18:10:35.909414 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861" Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.324120 4752 patch_prober.go:28] interesting pod/machine-config-daemon-cm8wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.324272 4752 prober.go:107] "Probe failed" probeType="Liveness" 
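
NOTE: The repeating ImagePullBackOff entries are not new failures; they are the kubelet declining to retry immediately. Failed pulls are retried with an exponential backoff (doubling per attempt, capped at five minutes by default), which is why the same four pods resurface every 10-30 seconds in this window. The pull path can also be exercised on the node directly, outside the kubelet, to separate registry problems from kubelet problems; a sketch assuming the cluster pull secret has been exported to /tmp/pull-secret.json (a hypothetical path):

    podman pull --authfile /tmp/pull-secret.json registry.redhat.io/openshift4/ose-cli:latest

podman consults the same /etc/containers policy and signature-storage configuration as CRI-O, so the same "reading signatures ... status 500" error should reproduce for as long as the registry endpoint is unhealthy.
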
pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.324341 4752 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.325087 4752 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"854c3bc3b9416613e89f531087b95ec91a6b133877f6a495839ae5cf6f7f45bb"} pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.325241 4752 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" podUID="53ce186c-640f-4ade-94e1-587c1440fe87" containerName="machine-config-daemon" containerID="cri-o://854c3bc3b9416613e89f531087b95ec91a6b133877f6a495839ae5cf6f7f45bb" gracePeriod=600 Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.665644 4752 generic.go:334] "Generic (PLEG): container finished" podID="53ce186c-640f-4ade-94e1-587c1440fe87" containerID="854c3bc3b9416613e89f531087b95ec91a6b133877f6a495839ae5cf6f7f45bb" exitCode=0 Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.665717 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerDied","Data":"854c3bc3b9416613e89f531087b95ec91a6b133877f6a495839ae5cf6f7f45bb"} Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.666405 4752 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cm8wb" event={"ID":"53ce186c-640f-4ade-94e1-587c1440fe87","Type":"ContainerStarted","Data":"0bc2971869c2b9398a7da38f02a080fa91ea4c94f3c1cf6958e54eb88e30bcd1"} Feb 27 18:10:36 crc kubenswrapper[4752]: I0227 18:10:36.666454 4752 scope.go:117] "RemoveContainer" containerID="2d15bd23de76affad936b2b453b80049294c778d3446231b8cd3ac1c0f04c31a" Feb 27 18:10:40 crc kubenswrapper[4752]: E0227 18:10:40.912131 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0" Feb 27 18:10:42 crc kubenswrapper[4752]: E0227 18:10:42.908721 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25" Feb 27 18:10:43 crc kubenswrapper[4752]: I0227 18:10:43.912421 4752 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 27 18:10:44 crc 
Feb 27 18:10:44 crc kubenswrapper[4752]: E0227 18:10:44.917206 4752 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 27 18:10:44 crc kubenswrapper[4752]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Feb 27 18:10:44 crc kubenswrapper[4752]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n5l9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29536930-blhls_openshift-infra(24b7d3c2-d38e-4034-8402-e6597e931234): ErrImagePull: copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)
Feb 27 18:10:44 crc kubenswrapper[4752]: > logger="UnhandledError"
Feb 27 18:10:44 crc kubenswrapper[4752]: E0227 18:10:44.918372 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/openshift4/ose-cli@sha256=69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9/signature-7: status 500 (Internal Server Error)\"" pod="openshift-infra/auto-csr-approver-29536930-blhls" podUID="24b7d3c2-d38e-4034-8402-e6597e931234"
Feb 27 18:10:49 crc kubenswrapper[4752]: E0227 18:10:49.910808 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536926-zvlsw" podUID="d054e9b4-5554-4da5-a57b-f834ad3fa861"
Feb 27 18:10:53 crc kubenswrapper[4752]: E0227 18:10:53.909193 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9-operator@sha256:0f668226ec5fdc1726e9df3bb807b172040b59313117c8cbed8ade8e730a2225\\\"\"" pod="metallb-system/metallb-operator-controller-manager-55b4885b68-sg4ds" podUID="5238f5ee-3ece-4ec6-a6d5-ddf9eabfe0b0"
Feb 27 18:10:54 crc kubenswrapper[4752]: E0227 18:10:54.908985 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:02d5ffcd04189eb7328b7a5f79ce5e4cdf09216f2560d702e61e63eb8e2588d9\\\"\"" pod="metallb-system/metallb-operator-webhook-server-7f65d79b55-6j6sn" podUID="d07bbb76-6065-4740-a3d5-45d511110e25"
Feb 27 18:10:56 crc kubenswrapper[4752]: E0227 18:10:56.910537 4752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29536930-blhls" podUID="24b7d3c2-d38e-4034-8402-e6597e931234"
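
NOTE: Net effect of this log window: four pods (the two auto-csr-approver jobs, the metallb-operator controller-manager, and its webhook server) are stuck in image-pull backoff on the same registry.redhat.io sigstore 500, and one machine-config-daemon container was restarted by its liveness probe. The stuck pods should recover on their own once the registry is healthy again; while triaging, the affected set can be listed cluster-wide, for example with:

    oc get pods -A | grep -E 'ErrImagePull|ImagePullBackOff'

followed by `oc describe pod` on any match to see the full pull error and backoff timing in its events.
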